Search results for: affective-analytical decision framework
7014 From Proficiency to High Accomplishment: Transformative Inquiry and Institutionalization of Mentoring Practices in Teacher Education in South-Western Nigeria
Authors: Michael A. Ifarajimi
Abstract:
The transition from graduate teacher to highly accomplished teacher has been widely portrayed in the literature as challenging. Pre-service teachers wrestle with complex issues such as implementing assessment, meeting prescribed learning outcomes, taking risks, and supporting eco-sustainability, among others. These concerns are further complicated when they extend beyond the classroom into the broader school setting and community. Meanwhile, the pre-service teacher education programme as currently run in Nigeria cannot adequately prepare newly trained teachers for the realities of classroom teaching, and there appears to be no formal structure in place for mentoring such teachers by the more seasoned teachers in schools. The central research question of the study, therefore, is: which institutional framework can be distinguished for enactment in mentoring practices in teacher education? The study was conducted in five colleges of education in South-West Nigeria, and a sample of 1000 pre-service teachers on their final-year practicum was randomly selected from the colleges. A pre-service teacher mentorship programme (PTMP) framework was designed and implemented, with a focus on the impact of transformative inquiry on the pre-service teacher support system. The study discovered a significant impact of mentoring on pre-service teachers' professional transformation. The study concluded that institutionalizing mentorship through transformative inquiry is a means to sustainable teacher education, professional growth, and effective classroom practice. The study recommended that the government enact policies that promote mentoring in teacher education and establish a framework for the implementation of mentoring practices in the colleges of education in Nigeria.
Keywords: institutionalization, mentoring, pre-service teachers, teacher education, transformative inquiry
Procedia PDF Downloads 133
7013 Automatic Detection of Traffic Stop Locations Using GPS Data
Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell
Abstract:
Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to help identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections or highway congestion. The Delaunay triangulation is used to perform this assessment in the clustering phase. While most existing clustering algorithms need assumptions about the data distribution, the effectiveness of the Delaunay triangulation relies on triangulating geographical data points without such assumptions. Our proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step is to use clustering to form groups of waypoints for signalized traffic and highway congestion. A binary classifier is then applied to distinguish highway congestion from signalized stop points, using the length of the cluster to identify congestion. The proposed framework identifies the stop positions and congestion points correctly in around 99.2% of trials, showing that it is possible, using limited GPS data, to distinguish the two categories with high accuracy.
Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data
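As an illustration of the clustering step, here is a minimal Python sketch (not the authors' code) that groups stopped waypoints by triangulating them and keeping only short Delaunay edges; the 50 m cutoff, the projected metre coordinates, and the toy data are assumptions for demonstration:

```python
# Hypothetical sketch: cluster stopped GPS waypoints by building a Delaunay
# triangulation and keeping only short edges, so nearby stops form groups.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import connected_components

def delaunay_clusters(points, max_edge_m=50.0):
    """points: (n, 2) array of projected x/y coordinates in metres."""
    tri = Delaunay(points)
    n = len(points)
    adj = lil_matrix((n, n))
    # Each simplex is a triangle; connect vertices whose edge is short enough.
    for simplex in tri.simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            if np.linalg.norm(points[a] - points[b]) <= max_edge_m:
                adj[a, b] = adj[b, a] = 1
    n_clusters, labels = connected_components(adj.tocsr(), directed=False)
    return n_clusters, labels

# Toy usage: two stop groups roughly 30 m wide, 500 m apart.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 10, (20, 2)), rng.normal(500, 10, (20, 2))])
k, labels = delaunay_clusters(pts)
print(k)  # expected: 2
```

In the paper's terms, the length (elongation) of each resulting cluster would then feed the binary classifier that separates highway congestion from signalized intersections.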
Procedia PDF Downloads 275
7012 Artificial Intelligence Impact on the Australian Government Public Sector
Authors: Jessica Ho
Abstract:
AI has helped governments, businesses, and industries transform the way they do things. AI is used to automate tasks, improving decision-making and efficiency, and is embedded in sensors and automation to save time and eliminate human error in repetitive tasks. Today, we see growth in AI that uses vast amounts of data to forecast with greater accuracy, inform decision-making, adapt to changing market conditions, and offer more personalised services based on consumer habits and preferences. Governments around the world share the opportunity to leverage these disruptive technologies to improve productivity while reducing costs. In addition, these intelligent solutions can help streamline government processes to deliver more seamless and intuitive user experiences for employees and citizens. This is a critical challenge for the NSW Government, as it is unable to determine the risk brought by the unprecedented pace of adoption of AI solutions in government. Government agencies must ensure that their use of AI complies with relevant laws and regulatory requirements, including those related to data privacy and security. Furthermore, there will always be ethical concerns surrounding the use of AI, such as the potential for bias, intellectual property rights, and its impact on job security. Within NSW's public sector, agencies are already testing AI for crowd control, infrastructure management, fraud compliance, public safety, transport, and police surveillance. Citizens are attracted to the ease of use and accessibility of AI solutions that do not require specialised technical skills, but this increased accessibility must be balanced against higher risk and exposure to the health and safety of citizens. Public agencies, for their part, struggle to keep up with this pace while minimising risks, and the low entry cost and open-source nature of generative AI have led to a rapid, organic increase in the development of AI-powered apps: "There is an AI for That" in Government. Other challenges include the apparent absence of legislative provisions that expressly authorise the NSW Government to use AI to make decisions. On the global stage, there are too many actors in the regulatory space, and a sovereign response is needed to minimise multiplicity and regulatory burden. Therefore, traditional corporate risk and governance frameworks and regulation and legislation frameworks will need to be evaluated against AI's unique challenges, given its rapidly evolving nature, the ethical considerations, and the heightened regulatory scrutiny affecting the safety of consumers and increasing risks for government. Creating an effective, efficient NSW Government governance regime, adapted to the range of different approaches to the application of AI, is not a mere matter of overcoming technical challenges. Technologies have a wide range of social effects on our surroundings and behaviours. There is compelling evidence that Australia's sustained social and economic advancement depends on AI's ability to spur economic growth, boost productivity, and address a wide range of societal and political issues. AI may also inflict significant damage, and if such harm is not addressed, the public's confidence in this kind of innovation will be weakened. This paper suggests several AI regulatory approaches for consideration that are forward-looking and agile while simultaneously fostering innovation and human rights.
The anticipated outcome is to ensure that the NSW Government matches the rising levels of innovation in AI technologies with appropriately balanced innovation in AI governance.
Keywords: artificial intelligence, machine learning, rules, governance, government
Procedia PDF Downloads 70
7011 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models and reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified by measuring the mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
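A hedged sketch of the averaging idea, using scikit-learn stand-ins for the three model families on synthetic data (the real framework trains on timing, weather, and pollutant features, which are not reproduced here):

```python
# Illustrative sketch (not the authors' code): average the class probabilities
# of three model families, as the abstract's combined model does.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=200, random_state=0),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
]
probas = []
for m in models:
    m.fit(X_tr, y_tr)
    probas.append(m.predict_proba(X_te))

combined = np.mean(probas, axis=0)   # simple average of the three models
accuracy = (combined.argmax(axis=1) == y_te).mean()
print(f"ensemble accuracy: {accuracy:.3f}")
```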
Procedia PDF Downloads 127
7010 Framework Development of Carbon Management Software Tool in Sustainable Supply Chain Management of Indian Industry
Authors: Sarbjit Singh
Abstract:
This framework development explored the status of green supply chain management (GSCM) in manufacturing SMEs and concluded that there was a significant gap with respect to carbon emissions measurement in supply chain activities. The measurement of carbon emissions within supply chains is an important green initiative toward their reduction. The majority of the SMEs struggled to quantify the greenhouse gas emissions in their supply chains and to turn them into low-carbon supply chains, i.e., GSCM. Thus, carbon management initiatives were amalgamated with the supply chain activities in order to measure and reduce the carbon emissions, conforming to the GHG Protocol scopes. The work therefore covers the development of a carbon management software (CMS) tool to quantify carbon emissions for effective carbon management. This tool is cheap and easy for industries to use in managing the carbon emissions within their supply chains.
Keywords: carbon emissions, carbon management software, supply chain management, Indian industry
Procedia PDF Downloads 469
7009 Modular Data and Calculation Framework for a Technology-based Mapping of the Manufacturing Process According to the Value Stream Management Approach
Authors: Tim Wollert, Fabian Behrendt
Abstract:
Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company's perspective. Whereas the design principles (e.g., pull, value-adding, customer orientation) remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activity. The digitalization of processes in the context of Industry 4.0 opens new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework that utilizes the available business data provided by information and communication technologies to automate the value stream mapping process, with a focus on the manufacturing process.
Keywords: lean management 4.0, value stream management (VSM) 4.0, dynamic value stream mapping, enterprise resource planning (ERP)
Procedia PDF Downloads 150
7008 China-Africa Diplomatic Discourse: Reconstructing the Principle of "Yi" as a Framework for Analyzing Sino-Africa Cooperation
Authors: Modestus Queen
Abstract:
As we know, diplomatic language carries the political ideology and cultural stance of a country. China's diplomatic discourse is complicated and heavily flavored with Chinese characteristics, and one of the core goals of President Xi's administration is to properly tell the story of China; this cannot be done without proper translation or interpretation of major Chinese diplomatic concepts. This research therefore seeks to interpret the relevance of "Yi" as used in "Zhèngquè Yì Lì Guān". The author argues that it is not enough to translate a document; it must be properly interpreted to portray it as politically, economically, culturally, and diplomatically relevant to the target audience, in this case, the African people. The first finding of the current study indicates that literal translation is a bad strategy, especially in Chinese diplomatic discourse. The second finding indicates that "Yi" can be used as a framework to analyze Sino-Africa relations from economic, social, and political perspectives, and the third finding indicates that "Yi" is the guiding principle of China's foreign policy towards Africa.
Keywords: Yi, justice, China-Africa, interpretation, diplomatic discourse, discourse reconstruction
Procedia PDF Downloads 141
7007 Site Selection of CNG Station by Using FUZZY-AHP Model (Case Study: Gas Zone 4, Tehran City Iran)
Authors: Hamidrza Joodaki
Abstract:
The most complex issue in urban land-use planning is site selection, which requires assessing a variety of elements and factors. Multi-Criteria Decision Making (MCDM) methods are the best approach for dealing with such complex problems. In this paper, a combination of the Analytical Hierarchy Process (AHP) model and fuzzy logic was used as the MCDM method to select the best site for a CNG station in the 4th gas zone of Tehran. The first and most important step in the FUZZY-AHP model is the selection of criteria and sub-criteria. Population, accessibility, proximity, and natural disasters were considered as the main criteria in this study. After choosing the criteria, they were weighted based on AHP using EXPERT CHOICE software, and fuzzy logic was used to enhance accuracy and bring the weights closer to reality. After these steps, criteria layers were produced and weighted based on the FUZZY-AHP model in GIS. Finally, through ArcGIS software, the layers were integrated and the best location for a CNG station within Tehran's 4th gas zone was selected.
Keywords: multiple criteria decision making (MCDM), analytic hierarchy process (AHP), fuzzy logic, geographic information system (GIS)
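For readers unfamiliar with the AHP weighting step, the following sketch derives criterion weights from a Saaty-style pairwise-comparison matrix via its principal eigenvector; the judgments in the matrix are invented, not the study's:

```python
# A minimal AHP sketch: derive criterion weights from a pairwise-comparison
# matrix via its principal eigenvector. The matrix values are hypothetical.
import numpy as np

# Saaty-style pairwise comparisons for: population, accessibility,
# proximity, natural disasters (illustrative judgments only).
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
w /= w.sum()                               # normalised criterion weights

# Consistency check (RI = 0.90 for n = 4 in Saaty's random-index table).
ci = (eigvals.real.max() - len(A)) / (len(A) - 1)
print("weights:", w.round(3), "CR:", round(ci / 0.90, 3))
```

In the paper's workflow, these crisp AHP weights are then fuzzified before the weighted criteria layers are overlaid in GIS.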
Procedia PDF Downloads 361
7006 Youth and Conflict in Pakistan: Understanding Causes and Promoting Peace
Authors: Irfan Khan
Abstract:
Both the analytical methods used to understand the phenomenon of peacebuilding and the ensuing viewpoints on achieving and sustaining "sustainable peace" are broad and diverse. This new field of study draws from sociology, anthropology, political theory, political economy, psychology, international relations, and, more recently, the development sciences to examine the wide range of 'conflicts' it describes. This paper emphasizes the significance of investigating the causes of youth disputes. It explains how police corruption encourages youth crime and why it is so important to address this issue head-on. It also examines the historical foundations and external pressures that have increased religious extremism and sectarian strife in Pakistan. The primary argument is that peace is not only a desirable 'goal' in itself but also a means to achieve political stability and long-term prosperity. Strategies for building peace may take many shapes, each tailored to the specifics of a given conflict, its scope, and the individuals involved. By drawing on the existing literature and applying it to the situation in Pakistan, this article proposes a viewpoint that centers on the participation of young people in the peacebuilding process. Due to their heightened susceptibility and penchant for demanding change, young people are more likely to get involved in a conflict where economic failure and unemployment are present. The piece also emphasizes the marginalization young people experience as a result of their absence from decision-making processes and the political system. The article claims that Pakistan's rapidly growing young population presents a significant chance for a long-term "demographic dividend" in the form of improvements in peacebuilding processes. This benefit will only materialize if serious steps are taken to increase young people's voice and agency in political decision-making.
Keywords: peacebuilding, youth-led initiatives, empowerment, conflict & violence, religious extremism, political involvement, decision-making
Procedia PDF Downloads 69
7005 Value Index, a Novel Decision Making Approach for Waste Load Allocation
Authors: E. Feizi Ashtiani, S. Jamshidi, M.H Niksokhan, A. Feizi Ashtiani
Abstract:
Waste load allocation (WLA) policies may use multi-objective optimization methods to find the most appropriate and sustainable solutions. These usually intend to simultaneously minimize two criteria: total abatement costs (TC) and environmental violations (EV). If other criteria, such as inequity, need to be minimized as well, more binary optimizations must be introduced through different scenarios. In order to reduce the calculation steps, this study presents the value index as an innovative decision-making approach. Since the value index contains both the environmental violations and the treatment costs, it can be maximized simultaneously with the equity index, which implies that defining different scenarios for environmental violations is no longer required. Furthermore, the solution is not necessarily the point with minimized total costs or environmental violations. This idea is tested on the Haraz River in northern Iran. Here, the dissolved oxygen (DO) level of the river is simulated with the Streeter-Phelps equation in MATLAB. The WLA is determined for fish farms using multi-objective particle swarm optimization (MOPSO) in two scenarios. In the first, the trade-off curves of TC-EV and TC-inequity are plotted separately, as in the conventional approach. In the second, the Value-Equity curve is derived. The comparative results show that the solutions lie in a similar range of inequity with lower total costs, owing to the freedom in environmental violation attained by the value index. As a result, the conventional approach can well be replaced by the value index, particularly for problems optimizing these objectives. This shortens the process of reaching the best solutions and may yield a better classification for scenario definition. It is also concluded that decision makers would do better to focus on the value index and weight its contents to find the most sustainable alternatives based on their requirements.
Keywords: waste load allocation (WLA), value index, multi-objective particle swarm optimization (MOPSO), Haraz River, equity
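A minimal Python rendering of the Streeter-Phelps oxygen-sag simulation mentioned above (the study used MATLAB; the rate constants and loads below are illustrative, not the Haraz River calibration):

```python
# Hedged sketch of the Streeter-Phelps oxygen-sag model; all parameter
# values are assumed for illustration.
import numpy as np

def streeter_phelps(t, L0, D0, kd, ka):
    """DO deficit D(t) downstream of a waste load, in mg/L.
    L0: initial BOD, D0: initial deficit, kd: deoxygenation rate,
    ka: reaeration rate (1/day); t: travel time in days."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)

DO_sat = 9.0                        # saturation DO, mg/L (assumed)
t = np.linspace(0, 10, 101)         # travel time, days
deficit = streeter_phelps(t, L0=12.0, D0=1.0, kd=0.35, ka=0.7)
DO = DO_sat - deficit
print(f"minimum DO: {DO.min():.2f} mg/L at day {t[DO.argmin()]:.1f}")
```

In the study, curves like this (one per fish-farm loading pattern) supply the EV term that the value index combines with treatment costs.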
Procedia PDF Downloads 422
7004 Formal Institutions and Women's Electoral Participation in Four European Countries
Authors: Sophia Francesca D. Lu
Abstract:
This research sought to produce evidence that formal institutions, such as electoral and internal party quotas, can advance women's active roles in the public sphere, using the cases of four European countries: Belgium, Germany, Italy, and the Netherlands. The quantitative dataset was provided by the University of Chicago and the Inter-university Consortium for Political and Social Research based on a two-year study (2008-2010) of political parties. Belgium employs constitutionally mandated electoral quotas and is therefore the only country analyzed for electoral quotas. Germany, Italy, and the Netherlands have internal party quotas, which are voluntarily adopted by political parties; their internal party quotas were correlated with women's descriptive representation. Using chi-square analysis, this study showed that the presence of electoral quotas is correlated with an increase in the percentage of women in parliament as well as in decision-making bodies. Likewise, using correlational analysis, a higher number of political parties employing internal voluntary party quotas is correlated with an increase in the percentage of women occupying seats in parliament as well as an increase in the percentage of women nominees on the electoral lists of political parties. In conclusion, gender quotas, whether electoral quotas or internal party quotas, are an effective policy tool for greater women's representation in political bodies. Political parties and governments should opt for gender quotas, whether electoral or internal party quotas, to address the underrepresentation of women in parliament, decision-making bodies, and policy formulation.
Keywords: electoral quota, Europe, formal institutions, institutional feminism, internal party quota, women's electoral participation
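For illustration, a chi-square test of the kind reported here can be run in a few lines; the 2x2 table below is invented, not the ICPSR data:

```python
# Illustrative chi-square test on a hypothetical contingency table:
# quota use (rows) vs. share of women above/below the median (columns).
from scipy.stats import chi2_contingency

table = [[34, 12],   # parties with internal quotas
         [15, 29]]   # parties without
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```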
Procedia PDF Downloads 429
7003 A Novel Guided Search Based Multi-Objective Evolutionary Algorithm
Authors: A. Baviskar, C. Sandeep, K. Shankar
Abstract:
Solving multi-objective optimization problems requires faster convergence and better spread. Though existing Evolutionary Algorithms (EAs) are able to achieve this, the computational effort can be further reduced by hybridizing them with innovative strategies. This study focuses on converging to the Pareto front faster while adapting the advantages of Strength Pareto Evolutionary Algorithm-II (SPEA-II) for a better spread. Two different approaches based on optimizing the objective functions independently are implemented. In the first method, the decision variables corresponding to the optima of the individual objective functions are strategically used to guide the search towards the Pareto front. In the second method, boundary points of the Pareto front are calculated and their decision variables are seeded into the initial population. Both methods are applied to different constrained and unconstrained multi-objective test functions. It is observed that the proposed guided-search-based algorithm gives better convergence and diversity than several well-known existing algorithms (such as NSGA-II and SPEA-II) in considerably fewer iterations.
Keywords: boundary points, evolutionary algorithms (EAs), guided search, strength pareto evolutionary algorithm-II (SPEA-II)
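A small sketch of the second strategy under stated assumptions: each objective of a toy bi-objective problem is minimized independently, and the resulting decision variables are injected into the initial population that an EA (e.g., SPEA-II-style selection and variation) would then evolve:

```python
# Hypothetical sketch of seeding boundary points into an EA's population;
# the objectives, bounds, and population size are illustrative only.
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: x[0] ** 2 + x[1] ** 2
f2 = lambda x: (x[0] - 2) ** 2 + (x[1] - 2) ** 2

bounds = [(-5, 5), (-5, 5)]
# Optimize each objective independently: these optima correspond to the
# extremes (boundary points) of the Pareto front.
seeds = [minimize(f, x0=np.zeros(2), bounds=bounds).x for f in (f1, f2)]

pop_size = 50
rng = np.random.default_rng(1)
population = rng.uniform(-5, 5, size=(pop_size, 2))
population[: len(seeds)] = seeds   # seed the boundary points
# ... an SPEA-II-style EA would evolve `population` from here, converging
# faster thanks to the seeded extremes.
print(population[:2])
```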
Procedia PDF Downloads 277
7002 Optimization Technique for the Contractor's Portfolio in the Bidding Process
Authors: Taha Anjamrooz, Sareh Rajabi, Salwa Bheiry
Abstract:
Selecting among the available projects during the bidding process is one of the essential areas for a contractor to concentrate on. It is important for the contractor to choose the right projects within its portfolio during the tendering stage based on certain criteria. It should align the bidding process with its organization strategies and goals as a screening process, so as to start with the right portfolio pool. It should then set a proper framework and use a suitable technique to optimize the selection process, concentrating greater effort during the tender stage on the goals of success and winning. In this research paper, a two-step framework is proposed to increase the efficiency of the contractor's bidding process and the chance of winning new project awards. In this framework, all projects initially pass through a first-stage screening process, in which the portfolio basket is evaluated and adjusted in accordance with the organization's strategies into a reduced portfolio pool that is in line with the organization's activities. In the second stage, the contractor uses linear programming to optimize the portfolio pool based on available resources such as manpower, light equipment, heavy equipment, financial capability, return on investment, and the success rate of winning the bid. This optimization model will therefore assist the contractor in utilizing its internal resources to the maximum and increase its chance of winning new projects, considering past experience with clients, the relationship built between the two parties, and the complexity of executing the projects. The objective of this research is to increase the contractor's winning chance in the bidding process based on the success rate and expected return on investment.
Keywords: bidding process, internal resources, optimization, contracting portfolio management
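A minimal sketch of the second-stage model, assuming an LP relaxation in which each project receives a bid intensity between 0 and 1; the resource figures and returns are invented for illustration:

```python
# Hypothetical LP sketch: maximise expected return subject to manpower,
# equipment, and capital limits. All coefficients are assumed values.
import numpy as np
from scipy.optimize import linprog

expected_return = np.array([120, 85, 160, 60])     # ROI x win prob, assumed
c = -expected_return                               # linprog minimises

A_ub = np.array([
    [40, 25, 60, 15],     # manpower required per project
    [10, 5, 20, 2],       # heavy equipment units
    [300, 150, 500, 80],  # capital tied up
])
b_ub = np.array([100, 25, 700])                    # available resources

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 4)
print("bid intensity per project:", res.x.round(2))
print("expected return:", -res.fun)
```

A production version would use binary decision variables (bid/no-bid) rather than this continuous relaxation.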
Procedia PDF Downloads 142
7001 Using the Clinical Decision Support Platform, Dem DX, to Assess the 'Urgent Community Care Team's Notes Regarding Clinical Assessment, Management, and Healthcare Outcomes
Abstract:
Background: The Heywood, Middleton & Rochdale Urgent Community Care Team (UCCT) is a great example of using a multidisciplinary team to cope with demand. The service reduces unnecessary admissions to hospitals and ensures that patients can leave the hospital sooner by making care more readily available within the community and patients' homes. The team comprises nurses, community practitioners, and allied health professionals, including physiotherapy, occupational therapy, pharmacy, and GPs. The main challenge for a team with such a range of experience and skill sets is to maintain consistency of care, which technology can help address. Allied healthcare professionals (HCPs) are often used in expanded roles, with duties mainly involving patient consultations and decision-making to ease pressure on doctors. The Clinical Reasoning Platform (CRP) Dem Dx is used to support new as well as experienced professionals in the decision-making process. By guiding HCPs through diagnosing patients from an expansive directory of differential diagnoses, patients can receive quality care in the community. Actions on the platform are determined using NICE guidelines along with local guidance influencing the assessment and management of a patient. Objective: To compare the clinical assessment, decisions, and actions taken by the UCCT multidisciplinary team in the community with those of Dem Dx, using retrospective clinical cases. Methodology: Dem Dx was used to analyse 192 anonymised cases provided by the HMR UCCT. The team's performance was compared with Dem Dx regarding the quality of the documentation of the clinical assessment and the next steps on the patient's journey, including the initial management, actions, and any onward referrals made. The cases were audited by two medical doctors. Results: The study found that the actions outlined by the Dem Dx platform were appropriate in almost 87% of cases. In a direct comparison between Dem Dx and the actions taken by the clinical team, the platform was found suitable 83% (p<0.001) of the time and could lead to a potential improvement of 66% in the assessment and management of cases. Dem Dx also served to highlight the importance of comprehensive, high-quality clinical documentation. The quality of documentation of cases by the UCCT can be improved to provide a detailed account of the assessment and management process. By providing step-by-step guidance and documentation at every stage, Dem Dx may help ensure that legal accountability is fulfilled. Conclusion: With the ever-expanding workforce in the NHS, technology has become a key component in driving healthcare outcomes. To improve healthcare provision and clinical reasoning, a decision support platform can be integrated into HCPs' clinical practice. This retrospective study highlighted the potential for assistance with clinical assessments, with choosing the most appropriate next steps and actions in a patient's care, and for improvements in documentation. A further study has been planned to ascertain the effectiveness of the clinical reasoning platform in improving outcomes when used by clinicians in the clinical setting.
Keywords: allied health professional, assessment, clinical reasoning, clinical records, clinical decision-making, documentation
Procedia PDF Downloads 164
7000 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy
Authors: Kemal Polat
Abstract:
In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, covering image-level, lesion-specific, and anatomical components, were fed into the classifier algorithms. The dataset used in this study was taken from the University of California, Irvine (UCI) machine learning repository. Feature weighting methods, including fuzzy c-means clustering based feature weighting, subtractive clustering based feature weighting, and Gaussian mixture clustering based feature weighting, were used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms were used: multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes. The hybrid method combining subtractive clustering based feature weighting and a decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for medical dataset classification.
Keywords: machine learning, data weighting, classification, data mining
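The abstract does not give the exact weighting rule, so the sketch below shows one plausible reading: scale each feature by the ratio of its global mean to the mean of its cluster centres, then classify the weighted data with a decision tree. KMeans stands in for fuzzy c-means, and a bundled dataset stands in for the UCI DR set:

```python
# Speculative sketch of clustering-centre-based feature weighting; the
# weighting rule, the KMeans stand-in, and the dataset are all assumptions.
from sklearn.cluster import KMeans
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
centers = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X).cluster_centers_
weights = X.mean(axis=0) / centers.mean(axis=0)   # per-feature weights
Xw = X * weights                                  # weighted feature space

clf = DecisionTreeClassifier(random_state=0)
print("weighted accuracy:", cross_val_score(clf, Xw, y, cv=5).mean().round(3))
print("raw accuracy:     ", cross_val_score(clf, X, y, cv=5).mean().round(3))
```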
Procedia PDF Downloads 326
6999 An Agile, Intelligent and Scalable Framework for Global Software Development
Authors: Raja Asad Zaheer, Aisha Tanveer, Hafza Mehreen Fatima
Abstract:
Global Software Development (GSD) is becoming the norm in the software industry, despite the fact that global distribution of teams presents special issues for effective communication and coordination. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques and tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been used successfully with distributed teams. We employed an exploratory research method to analyze recent studies on the challenges of GSD and their proposed solutions. In our study, we looked deeply into six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We established that none of these challenges can be neglected for distributed teams of any kind: they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper, we focus on creating a scalable framework for detecting and overcoming these commonly faced challenges. In the proposed solution, our objective is to suggest agile techniques and tools relevant to a particular problem faced by organizations managing distributed teams. We focused mainly on Scrum and XP techniques and tools because they are widely accepted and used in the industry. Our solution identifies the problem and suggests an appropriate technique or tool to help solve it, based on a globally shared knowledgebase. A cause-and-effect relationship can be established using a fishbone diagram built from the inputs provided for issues commonly faced by organizations; based on the identified cause, our framework suggests a suitable tool. Hence, the proposed scalable, extensible, self-learning, intelligent framework will help implement and assess GSD so as to achieve the maximum out of it. The globally shared knowledgebase will help new organizations easily adapt the best practices set forth by practicing organizations.
Keywords: agile project management, agile tools/techniques, distributed teams, global software development
Procedia PDF Downloads 314
6998 Realizing the Full Potential of Islamic Banking System: Proposed Suitable Legal Framework for Islamic Banking System in Tanzania
Authors: Maulana Ayoub Ali, Pradeep Kulshrestha
Abstract:
The laws of any given secular state contribute greatly to the growth of the Islamic banking system, because the system uses conventional laws to govern its activities. The former should therefore be ready to accommodate the latter in order to make the Islamic banking system work properly without affecting the current conventional banking system. Islamic financial rules have been practiced since the birth of Islam. Following the recent world economic challenges in the financial sector, a quick rebirth of contemporary Islamic ethical banking took place. The rise of the Islamic banking system is due to various reasons, including but not limited to the failure of the interest-based economy to solve financial problems around the globe. The Islamic banking system has therefore been adopted as an alternative banking system in order to recover the heavily damaged global financial sector. But the Islamic banking system faces a number of challenges that hinder its smooth operation in different parts of the world. This paper concentrates on the legal challenges, touching on others only where it was proper to do so. The most important finding concerns the regulatory and supervisory framework for the Islamic banking system, in Tanzania and in other nations, which is considered crucial for the development of the Islamic banking industry. This paper analyses what was observed in the study of that area and recommends the necessary actions to be taken on board in a bid to help the Islamic banking system reach its full potential of serving the larger community by providing an ethical, equitable, affordable, interest-free, and society-centred banking system around the globe.
Keywords: Islamic banking, interest-free banking, ethical banking, legal framework
Procedia PDF Downloads 149
6997 An Approach to Manage and Evaluate Asset Performance
Authors: Mohammed Saif Al-Saidi, John P. T. Mo
Abstract:
Modern engineering assets are complex and very high in value. They are expected to function for years to come, with the ability to handle changes in technology and ageing modifications. The ageing of an engineering asset and the continuous increase in the number of vendors and contractors force asset operation management (or the owner) to design an asset system which can capture these changes. Furthermore, accurate performance measurement and risk evaluation processes are highly needed. Therefore, this paper explores the nature of asset management system performance evaluation for an engineering asset based on System Support Engineering (SSE) principles. The research work explores the asset support system from a range of perspectives, interviewing managers from across a refinery organisation. The factors contributing to the complexity of an asset management system are described in context, clustering them into several key areas. It is proposed that the SSE framework may then be used as a tool for the analysis and management of assets. The paper concludes with a discussion of potential applications of the framework and opportunities for future research.
Keywords: asset management, performance, evaluation, modern engineering, System Support Engineering (SSE)
Procedia PDF Downloads 678
6996 The Effect of Tacit Knowledge for Intelligence Cycle
Authors: Bahadir Aydin
Abstract:
It is difficult to access accurate knowledge because of the sheer mass of data, and this huge volume of data makes the environment more and more chaotic. Data are the main pillar of intelligence, and the relationship between intelligence and knowledge is significant for understanding underlying truths. The data gathered from different sources can be modified, interpreted, and classified using the intelligence cycle process. This process is applied in order to progress to wisdom as well as intelligence. Within this process, the effect of tacit knowledge is crucial. Knowledge, which is classified as explicit or tacit, is the key element for any purpose. Tacit knowledge can be seen as "the tip of the iceberg": it accounts for much more than we guess throughout the intelligence cycle. If the concept of the intelligence cycle is scrutinized, it can be seen to contain risks and threats as well as success. The main purpose of all organizations is to be successful by eliminating risks and threats. Therefore, there is a need to connect, or fuse, existing information and the processes which can be used to develop it. Thanks to this process, decision-makers can be presented with a clear, holistic understanding as early as possible in the decision-making process. Moving from the current traditional reactive approach to a proactive intelligence cycle approach would reduce extensive duplication of work in the organization. By applying a new result-oriented cycle and tacit knowledge, intelligence can be procured and utilized more effectively and in a more timely manner.
Keywords: information, intelligence cycle, knowledge, tacit knowledge
Procedia PDF Downloads 514
6995 FLEX: A Backdoor Detection and Elimination Method in Federated Scenario
Authors: Shuqi Zhang
Abstract:
Federated learning allows users to participate in collaborative model training without sending data to third-party servers, reducing the risk of user data privacy leakage, and is widely used in smart finance and smart healthcare. However, the distributed architecture of federated learning and the existence of secure aggregation protocols make it inherently vulnerable to backdoor attacks. To solve this problem, FLEX, a federated learning backdoor defense framework based on group aggregation, cluster analysis, and neuron pruning, is proposed, and compatibility with secure aggregation protocols is achieved. The good performance of FLEX is verified by building a horizontal federated learning framework on the CIFAR-10 dataset for experiments: it achieves a 98% backdoor detection success rate and reduces the success rate of backdoor tasks to 0-10%.
Keywords: federated learning, secure aggregation, backdoor attack, cluster analysis, neuron pruning
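A hedged sketch of the cluster-analysis stage only (FLEX also performs group aggregation and neuron pruning, omitted here): flag client updates that fall outside the majority cluster before aggregation. The two-cluster assumption and the synthetic updates are illustrative:

```python
# Illustrative outlier-cluster filter for client model updates; not the
# FLEX implementation, just the cluster-analysis idea in isolation.
import numpy as np
from sklearn.cluster import KMeans

def filter_updates(updates, n_clusters=2):
    """updates: (n_clients, n_params) array of flattened model deltas."""
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(updates)
    # Assume the majority cluster is benign; drop everything else.
    benign = np.bincount(labels).argmax()
    return updates[labels == benign]

rng = np.random.default_rng(0)
honest = rng.normal(0.0, 0.1, size=(18, 100))
poisoned = rng.normal(3.0, 0.1, size=(2, 100))   # simulated backdoored updates
kept = filter_updates(np.vstack([honest, poisoned]))
print(kept.shape)  # expected (18, 100): the two outliers are excluded
```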
Procedia PDF Downloads 96
6994 Price Effect Estimation of Tobacco on Low-wage Male Smokers: A Causal Mediation Analysis
Authors: Kawsar Ahmed, Hong Wang
Abstract:
The study's goal was to estimate the causal mediation impact of a tobacco tax before and after price hikes among low-income male smokers, with a particular emphasis on the effect-estimation pathways framework for continuous and dichotomous variables. From July to December 2021, cross-sectional observational data (n=739) were collected from low-wage smokers in Bangladesh. The effect was estimated using the quasi-Bayesian technique and a binomial probit model, with sensitivity analysis based on simulation in R's mediation package. After a price rise for tobacco products, the average number of cigarette or bidi sticks consumed decreased from 6.7 to 4.56. Rising tobacco product prices have a direct effect on low-income people's decisions to quit or lessen their daily smoking habits: Average Causal Mediation Effect (ACME) [effect=2.31, 95% confidence interval (C.I.) = (4.71-0.00), p<0.01], Average Direct Effect (ADE) [effect=8.6, 95% (C.I.) = (6.8-0.11), p<0.001], and overall significant effects (p<0.001). The proportion of the effect on tobacco smoking choice mediated through income is 26.1% of the decline following the price rise. In the sensitivity analysis, the ACME and ADE curves, based on the observed coefficients of determination, support the hypothesized model of a substantial effect following price rises. To reduce smoking, price increases through taxation show a positive causal mediation through income that affects the decision to limit tobacco use, supporting healthcare policy for low-income men.
Keywords: causal mediation analysis, directed acyclic graphs, tobacco price policy, sensitivity analysis, pathway estimation
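In the spirit of the quasi-Bayesian ACME/ADE estimates computed with R's mediation package, here is an illustrative product-of-coefficients bootstrap in Python on simulated continuous data (the study's outcome model is a binomial probit; all coefficients and data below are invented):

```python
# Illustrative mediation bootstrap: treatment -> mediator -> outcome.
# This is a simplified linear stand-in, not the study's estimator.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 739
price = rng.normal(size=n)                                 # treatment
income = 0.5 * price + rng.normal(size=n)                  # mediator
sticks = 2.0 * price + 1.5 * income + rng.normal(size=n)   # outcome

acme, ade = [], []
for _ in range(1000):
    idx = rng.integers(0, n, n)                            # bootstrap resample
    m_fit = sm.OLS(income[idx], sm.add_constant(price[idx])).fit()
    y_fit = sm.OLS(sticks[idx], sm.add_constant(
        np.column_stack([price[idx], income[idx]]))).fit()
    a, (c_, b) = m_fit.params[1], y_fit.params[1:]
    acme.append(a * b)     # indirect (mediated) effect
    ade.append(c_)         # direct effect
print("ACME:", np.mean(acme).round(2), "ADE:", np.mean(ade).round(2))
```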
Procedia PDF Downloads 112
6993 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance detects or predicts disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance has become more and more popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geolocation features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted for supervised classification, and N-fold cross-validation confusion matrices are given as simulation results, with an overall system recall of 85.595% achieved.
Keywords: syndromic surveillance, tweets, machine learning, data mining, latent Dirichlet allocation (LDA), influenza
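A sketch under stated assumptions of how an LDA engine can be adapted for supervised classification: topic proportions extracted from tweets become features for a downstream flu/not-flu classifier. The exact wiring in the paper is not specified, and the tweets and labels are invented:

```python
# Hypothetical pipeline: bag-of-words -> LDA topics -> logistic classifier.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "fever and cough all week, staying home in Abu Dhabi",
    "beautiful sunset over Dubai marina tonight",
    "whole office down with the flu in Al Ain",
    "traffic on sheikh zayed road is terrible",
]
labels = [1, 0, 1, 0]   # 1 = influenza-related

pipe = make_pipeline(
    CountVectorizer(stop_words="english"),
    LatentDirichletAllocation(n_components=2, random_state=0),
    LogisticRegression(),
)
pipe.fit(tweets, labels)
print(pipe.predict(["aching and feverish, no school today"]))
```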
Procedia PDF Downloads 116
6992 Determinant Factor of Farm Household Fruit Tree Planting: The Case of Habru Woreda, North Wollo
Authors: Getamesay Kassaye Dimru
Abstract:
The cultivation of fruit trees in degraded areas has two-fold importance. Firstly, it improves food availability and income, and secondly, it promotes the conservation of soil and water, improving, in turn, the productivity of the land. The main objectives of this study are to identify the determinants of farmers' fruit tree planting decisions and the major fruit production challenges and opportunities of the study area. The analysis was made using primary data collected from 60 sample households selected randomly from the study area in 2016. The primary data were supplemented by data collected from key informants. In addition to descriptive statistics and statistical tests (chi-square test and t-test), a logit model was employed to identify the determinants of the fruit tree planting decision. Drought, pest incidence, land degradation, lack of inputs, lack of capital, poor maintenance of irrigation schemes, misuse of irrigation water, and limited agricultural personnel are the major production constraints identified. The opportunities that need to be further exploited are better access to irrigation, main road access, the endowment of a preferred guava variety, the experience of farmers, and the proximity of the study area to a research center. The result of the logit model shows that, of the different factors hypothesized to determine the fruit tree planting decision, the age of the household head, access to markets, and farmers' perception of fruits' disease and pest resistance are found to be significant. The result reveals important implications for the promotion of fruit production, both for land degradation control and rehabilitation and for improving the livelihoods of farming households.
Keywords: degradation, fruit, irrigation, pest
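A minimal sketch of the logit specification with the three significant predictors named above; the data are simulated, not the survey's:

```python
# Illustrative logit model of the planting decision; coefficients and
# data are invented to mirror the study's significant predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
age = rng.normal(45, 10, n)
market_access = rng.integers(0, 2, n)          # 1 = good market access
perceived_resistance = rng.integers(0, 2, n)   # 1 = trees seen as resistant
logit_p = -4 + 0.05 * age + 1.2 * market_access + 1.0 * perceived_resistance
plant = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([age, market_access, perceived_resistance]))
print(sm.Logit(plant, X).fit(disp=0).summary())
```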
Procedia PDF Downloads 235
6991 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy
Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie
Abstract:
In recent years, there has been an explosion in the use of technology that helps discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnoses and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces the best results and takes the lowest time to build a model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best due to its high accuracy and the lowest model-building time.
Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data
Procedia PDF Downloads 320
6990 Project Knowledge Harvesting: The Case of Improving Project Performance through Project Knowledge Sharing Framework
Authors: Eng Rima Al-Awadhi, Abdul Jaleel Tharayil
Abstract:
In a project-centric organization like KOC, managing project knowledge is of critical importance to the success of the project and the organization. However, due to the nature and complexity involved, each project engagement generates many 'learnings' that need to be factored in when new projects are initiated, so as to avoid repeating the same mistakes. But many a time these learnings remain localized as 'tacit knowledge', leading to scope rework, schedule overruns, adjustment orders, concession requests, and claims. As KOC follows an asset-based organization structure with a multi-cultural and multi-ethnic workforce, and a large chunk of the work is carried out through complex, long-term project engagements, the diffusion of 'learnings' across assets, while dealing with the natural entropy of the organization, is of great significance. Considering the relatively high number of mega projects, it is important that the issues raised during the project life cycle be centrally harvested and analyzed, and that the 'learnings' from these issues be shared, absorbed, and in turn utilized to enhance and refine existing processes and practices, leading to improved project performance. One of the many factors contributing to the successful on-time completion of a project is a reduction in the number of variations or concessions triggered during the project life cycle. The project-process-integrated knowledge sharing framework presented here discusses the knowledge harvesting methodology adopted, the challenges faced, the learnings acquired, and their impact on project performance. The framework facilitates the proactive identification of issues that may have an impact on the overall quality of the project and improves performance.
Keywords: knowledge harvesting, project integrated knowledge sharing, performance improvement, knowledge management, lessons learned
Procedia PDF Downloads 396
6989 Corporate In-Kind Donations and Economic Efficiency: The Case of Surplus Food Recovery and Donation
Authors: Sedef Sert, Paola Garrone, Marco Melacini, Alessandro Perego
Abstract:
This paper is aimed at enhancing our current understanding of the motivations behind corporate in-kind donations and at finding out whether economic efficiency may be a major driver. Our empirical setting consists of surplus food recovery and donation by companies in the food supply chain. This choice of empirical setting is motivated by growing attention to the paradox of food insecurity and food waste: an estimated 842 million people worldwide suffer from regularly not getting enough food, while approximately 1.3 billion tons of food is wasted globally each year. Recently, many authors have started considering surplus food donation to nonprofit organizations as a way to cope with the social issue of food insecurity and the environmental issue of food waste. The corporate philanthropy literature has examined the motivations behind corporate donations for social purposes, such as altruism, enhancement of employee morale, the organization's image, supplier/customer relationships, and local community support. However, the relationship with economic efficiency has not been studied, and in many cases pure economic efficiency as a decision-making factor is neglected. Although some studies in the literature hint at the economic value created by surplus food donation, such as saving landfill fees or obtaining tax deductions, so far no study focuses deeply on this phenomenon. In this paper, we develop a conceptual framework which explores the economic barriers and drivers of alternative surplus food management options, i.e., discounts, secondary markets, feeding animals, composting, energy recovery, and disposal. The case study methodology is used to conduct the research. Protocols for semi-structured interviews were prepared based on an extensive literature review and adapted after expert opinions. The interviews were conducted mostly with supply chain and logistics managers of 20 companies in the food sector operating in Italy, in particular in the Lombardy region. The results show that, in the current situation, food manufacturing companies can experience cost savings by recovering and donating surplus food compared with the other methods, especially the disposal option. On the other hand, the retail and food service sectors are not economically incentivized to recover and donate surplus food to disfavored populations. The paper shows that not only strategic and moral motivations but also economic motivations play an important role in the managerial decision-making process in surplus food management. We also believe that our research, while rooted in the surplus food management topic, delivers some interesting implications for more general research on corporate in-kind donations. It also shows that there is huge room for policy making that favors the recovery and donation of surplus products.
Keywords: corporate philanthropy, donation, recovery, surplus food
Procedia PDF Downloads 312
6988 Planning of Construction Material Flow Using Hybrid Simulation Modeling
Authors: A. M. Naraghi, V. Gonzalez, M. O'Sullivan, C. G. Walker, M. Poshdar, F. Ying, M. Abdelmegid
Abstract:
Discrete Event Simulation (DES) and Agent-Based Simulation (ABS) are two simulation approaches that have been proposed to support decision-making in the construction industry. Despite the wide use of these simulation approaches in the construction field, their applications for production and material planning are still limited. This is largely due to the dynamic and complex nature of construction material supply chain systems. Moreover, managing the flow of construction material is not well integrated with site logistics in traditional construction planning methods. This paper presents a hybrid of DES and ABS to simulate on-site and off-site material supply processes. DES is applied to determine the best production scenarios using information from on-site production systems, while ABS is used to optimize the supply chain network. A case study of a construction piling project in New Zealand is presented, illustrating the potential benefits of using the proposed hybrid simulation model in construction material flow planning. The hybrid model can be used to evaluate the impact of different decisions on construction supply chain management.
Keywords: construction supply-chain management, simulation modeling, decision-support tools, hybrid simulation
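As a flavour of the DES half of the hybrid, the sketch below implements a bare-bones event loop for a single-crew piling station; an ABS layer would sit alongside it, with supplier and truck agents scheduling delivery events on the same heap. All figures are invented:

```python
# Minimal DES core (illustrative only): a heap of timestamped events
# drives a single-crew piling station with queueing.
import heapq

def simulate(arrivals, service_time):
    """Returns (pile index, completion time) for each delivered pile."""
    events, done, free_at = [], [], 0.0
    for i, t in enumerate(arrivals):
        heapq.heappush(events, (t, i))     # pile i arrives on site at time t
    while events:
        t, i = heapq.heappop(events)
        start = max(t, free_at)            # wait if the crew is busy
        free_at = start + service_time
        done.append((i, free_at))
    return done

# Piles delivered at hours 0, 1, and 5; each takes 3 hours to install.
print(simulate([0.0, 1.0, 5.0], service_time=3.0))
# expected: [(0, 3.0), (1, 6.0), (2, 9.0)]
```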
Procedia PDF Downloads 207
6987 Providing a Proposed Framework for the Copyright of Library Resources in Iran: A Comparative Study of the Copyright Laws of Iran, Australia and U.S.
Authors: Zeinab Papi
Abstract:
This study was aimed at analyzing the copyright laws of Iran, Australia, and the U.S., along with library portals, thereby providing a proposed framework for the copyright of library resources for the NLAI and other Iranian libraries while considering the current situation and internal Iranian laws. This is an applied study falling into the category of qualitative research. The documentary analysis method and the comparative method were used to resolve the problem and answer the research questions. The National Library of Australia (NLA) and the Library of Congress (LC), together with the NLAI, formed the research community. In addition, the Iranian Law for the Protection of Authors, Composers and Artists Rights (1970), the Australian Copyright Act (1968), and the U.S. Copyright Law (1976) were purposefully selected as the three main resources among other documents and resources. Findings revealed that the dimensions of fair and non-profit use, duration of copyright, license and agreement, copyright policy, moral rights, economic rights, and infringement of copyright were the main dimensions that, along with 49 main components, formed the proposed framework for the copyright of information resources for the NLAI and other Iranian libraries. It should be acknowledged that there are differences between countries' laws in various copyright fields, and each country takes its internal conditions into account when compiling and revising its laws. By following the laws of other countries, it is possible to effectively improve and develop copyright laws. The researcher hopes that this research can help create awareness and ability among librarians, support the formulation of a copyright policy in Iranian libraries, and assist legislators in revising copyright laws regarding library exceptions and exemptions.
Keywords: copyright, library resources, National Library and Archives of the I.R. of Iran, National Library of Australia, Library of Congress, copyright law
Procedia PDF Downloads 75
6986 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies; identifying heart disease in its initial stages, however, is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques, hoping to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
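A hedged sketch of the model comparison on synthetic stand-in data (the study's actual heart-disease features and preprocessing are not reproduced here):

```python
# Illustrative comparison of the three classifiers via cross-validation;
# make_classification stands in for the real heart-disease dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=900, n_features=13, random_state=0)

for name, model in [
    ("SVM", SVC()),
    ("Decision Tree", DecisionTreeClassifier(random_state=0)),
    ("Random Forest", RandomForestClassifier(n_estimators=300, random_state=0)),
]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```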
Procedia PDF Downloads 40
6985 A Greedy Alignment Algorithm Supporting Medication Reconciliation
Authors: David Tresner-Kirsch
Abstract:
Reconciling patient medication lists from multiple sources is a critical task supporting the safe delivery of patient care. Manual reconciliation is a time-consuming and error-prone process, and recently attempts have been made to develop efficiency- and safety-oriented automated support for professionals performing the task. An important capability of any such support system is automated alignment: finding which medications from one list correspond to which medications from a different source, regardless of misspellings, naming differences (e.g., brand name vs. generic), or changes in treatment (e.g., switching a patient from one antidepressant class to another). This work describes a new algorithmic solution to this alignment task, using a greedy matching approach based on string similarity, edit distances, concept extraction and normalization, and synonym search derived from the RxNorm nomenclature. The accuracy of this algorithm was evaluated against a gold-standard corpus of 681 medication records; this evaluation found that the algorithm predicted alignments with 99% precision and 91% recall. This performance is sufficient to support decision support applications for medication reconciliation.
Keywords: clinical decision support, medication reconciliation, natural language processing, RxNorm
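The string-similarity core of such a greedy aligner can be sketched as follows; this simplification omits the paper's RxNorm concept extraction, normalization, and synonym search, and the 0.4 distance threshold is an assumption:

```python
# Illustrative greedy alignment by normalised edit distance only.
def edit_distance(a: str, b: str) -> int:
    """Standard Levenshtein distance with a rolling row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def greedy_align(list_a, list_b, max_norm_dist=0.4):
    pairs, used = [], set()
    # Consider all cross-list pairs, most similar first.
    candidates = sorted(
        ((edit_distance(a.lower(), b.lower()) / max(len(a), len(b)), a, b)
         for a in list_a for b in list_b),
        key=lambda t: t[0])
    for d, a, b in candidates:
        if d <= max_norm_dist and a not in used and b not in used:
            pairs.append((a, b))
            used.update((a, b))
    return pairs

print(greedy_align(["lisinopril 10mg", "sertraline"],
                   ["Lisinopril 10 mg tablet", "sertraline HCl"]))
```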
Procedia PDF Downloads 285