Search results for: classification framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7021

5731 Too Well to Die; Too Ill to Live

Authors: Deepak Jugran

Abstract:

The last century witnessed rapid scientific growth and social policies aimed mainly at increasing the “life expectancy” of the population. As a result of these developments, the aging as well as ailing population is increasing every day. Despite the increase in “life expectancy”, we have not recorded a compression of morbidity, as the age of onset of the majority of health issues has not increased substantially. In recent years, the prevalence of chronic diseases, along with improved treatment, has also increased the number of people living with chronic diseases. While past social policies focused on increasing life expectancy in the population, in recent decades social policies and biomedical research have gradually shifted toward the potential of increasing healthy life, or healthspan. In this article, we review the existing frameworks of lifespan and healthspan and wish to ignite a discussion among social scientists and public health experts to propose a holistic framework that balances the trade-offs in social policies for “lifespan” and “healthspan”.

Keywords: lifespan, healthspan, chronic diseases, social policies

Procedia PDF Downloads 105
5730 Navigating Uncertainties in Project Control: A Predictive Tracking Framework

Authors: Byung Cheol Kim

Abstract:

This study explores a method for the signal-noise separation challenge in project control, focusing on the limitations of traditional deterministic approaches that use single-point performance metrics to predict project outcomes. We detail how traditional methods often overlook future uncertainties, resulting in tracking biases when reliance is placed solely on immediate data without adjustments for predictive accuracy. Our investigation led to the development of the Predictive Tracking Project Control (PTPC) framework, which incorporates network simulation and Bayesian control models to adapt more effectively to project dynamics. The PTPC introduces controlled disturbances to better identify and separate tracking biases from useful predictive signals. We will demonstrate the efficacy of the PTPC with examples, highlighting its potential to enhance real-time project monitoring and decision-making, marking a significant shift towards more accurate project management practices.

Keywords: predictive tracking, project control, signal-noise separation, Bayesian inference

Procedia PDF Downloads 9
5729 Simulation Aided Life Cycle Sustainability Assessment Framework for Manufacturing Design and Management

Authors: Mijoh A. Gbededo, Kapila Liyanage, Ilias Oraifige

Abstract:

Decision making for sustainable manufacturing design and management requires critical consideration due to the complex and partly conflicting issues of economic, social and environmental factors. Although there are tools capable of assessing one or two of the sustainability factors in combination, existing frameworks have not adequately integrated all three. A case study and a review of existing simulation applications also show that current approaches lack integration of the sustainability factors. In this paper, we discuss the development of a simulation-based framework to support a holistic assessment of sustainable manufacturing design and management. To achieve this, a strategic approach is introduced to investigate the strengths and weaknesses of existing decision-support tools. The investigation reveals that Discrete Event Simulation (DES) can serve as a solid foundation for other Life Cycle Analysis frameworks. The Simio DES application optimizes systems for both economic and competitive advantage; Granta CES EduPack and SimaPro collate data for Material Flow Analysis and environmental Life Cycle Assessment, while social and stakeholder analysis is supported by the Analytic Hierarchy Process, a Multi-Criteria Decision Analysis method. Such a common, integrated framework creates a platform for companies to build a computer simulation model of a real system and assess the impact of alternative solutions before implementing a chosen one.
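To illustrate the kind of event-driven logic DES rests on, the following is a minimal single-server queue sketch; it is a generic illustration under invented job times, not the Simio model used in the paper.

```python
# Minimal discrete-event-style simulation of a single-server workstation.
# A hypothetical sketch; arrival and service times are invented for illustration.
def simulate(arrivals, service_time):
    """Return the completion time of each job processed FIFO on one machine."""
    completions, free_at = [], 0.0
    for arrival in sorted(arrivals):
        start = max(arrival, free_at)  # job waits if the machine is busy
        free_at = start + service_time
        completions.append(free_at)
    return completions

print(simulate([0.0, 1.0, 2.0], 2.0))  # [2.0, 4.0, 6.0]
```

Alternative configurations (extra machines, different service times) can be compared by re-running such a model before committing to a real-world change, which is the role the framework assigns to DES.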

Keywords: discrete event simulation, life cycle sustainability analysis, manufacturing, sustainability

Procedia PDF Downloads 275
5728 A Secure System for Handling Information from Heterogeneous Sources

Authors: Shoohira Aftab, Hammad Afzal

Abstract:

Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only enables better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous information sources. The problem of providing a consolidated view of information can be handled using semantic data (information stored in such a way that it is understandable by machines and integrable without manual human intervention). However, integrating information using semantic web technology without any access management enforced will result in increased privacy and confidentiality concerns. In this research, we have designed and developed a framework that allows information from heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We have also devised an access control system for defining explicit privacy constraints. We designed and applied our framework to both semantic and non-semantic data from heterogeneous sources. Our approach is validated using scenario-based testing.
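A minimal sketch of how explicit access constraints over a consolidated view might be expressed; the role names and actions here are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical role-based policy guarding an integrated data view.
POLICY = {
    "analyst": {"read"},           # may query the consolidated view
    "curator": {"read", "write"},  # may also update source mappings
}

def allowed(role, action):
    """Return True if the role's policy grants the requested action."""
    return action in POLICY.get(role, set())

print(allowed("analyst", "write"))  # False: analysts have read-only access
```

Checking every query against such a policy before it reaches the integrated store is one simple way to enforce the privacy constraints the abstract describes.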

Keywords: information integration, semantic data, interoperability, security, access control system

Procedia PDF Downloads 348
5727 Rangeland Monitoring by Computerized Technologies

Authors: H. Arzani, Z. Arzani

Abstract:

Every piece of rangeland has a different set of physical and biological characteristics. This requires the manager to synthesize various information through regular monitoring to identify trends of change and make the right decisions for sustainable management. Range managers therefore need computerized technologies to monitor rangeland and select the best management practices. There are four examples of computerized technologies that can benefit sustainable management: (1) Photographic method for cover measurement: The method was tested in different vegetation communities in semi-humid and arid regions. Interpretation of pictures of quadrats was done using ArcView software. Data analysis was done in SPSS using the paired t-test. Based on the results, the photographic method can generally be used to measure ground cover in most vegetation communities. (2) GPS application for matching ground samples and satellite pixels: In the two provinces of Tehran and Markazi, six reference points were selected, and at each point eight GPS models were tested. A significant relation among GPS model, time and location with the accuracy of estimated coordinates was found. After selection of a suitable method, the coordinates of plots along four transects at each of six rangeland sites in Markazi province were recorded. The best time for GPS application was in the morning hours, and the Etrex Vista had less error than the other models. (3) Application of satellite data for rangeland monitoring: Focusing on the long-term variation of vegetation parameters such as vegetation cover and production is essential. Our study in grass and shrub lands showed significant correlations between quantitative vegetation characteristics and satellite data, so it is possible to monitor rangeland vegetation using digital data for sustainable utilization.
(4) Rangeland suitability classification with GIS: Range suitability assessment can facilitate sustainable management planning. The outputs of three sub-models, sensitivity to erosion, water suitability and forage production, were entered into the final range suitability classification model. GIS facilitated the classification of range suitability and produced suitability maps for sheep grazing. Generally, digital computers assist range managers to interpret, modify, calibrate or integrate information for correct management.
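As a reminder of the statistic behind step (1), the paired t value compares two measurements taken on the same quadrats; this is a generic sketch with invented cover values, not the authors' SPSS output.

```python
import math

def paired_t(x, y):
    """Paired t statistic: mean of the differences over its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Hypothetical cover (%) from photographs vs. field quadrats on the same plots.
print(paired_t([10.0, 12.0, 15.0], [9.0, 11.0, 13.0]))  # 4.0
```

A large |t| relative to the t distribution with n-1 degrees of freedom would indicate a systematic difference between the photographic and field estimates.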

Keywords: computer, GPS, GIS, remote sensing, photographic method, monitoring, rangeland ecosystem, management, suitability, sheep grazing

Procedia PDF Downloads 364
5726 Ethical Framework in Organ Transplantation and the Priority Line between Law and Life

Authors: Abel Sichinava

Abstract:

The need for organ transplantation is increasing vigorously worldwide. The numbers on the waiting lists grow, but the number of donors is not keeping up with the demand, even though there is a legal possibility of decreasing the gap between demand and supply. Most countries around the globe face an organ donation problem (living or deceased); however, the extent of the problem differs based on how well developed a country is. The determining issues seem to center on how aware the society is of the concept of organ donation, as well as on cultural and religious factors. Even if people are aware of the benefits of organ donation, they may still have fears that keep them from being in complete agreement with the idea. Some believe that in the case of deceased organ donation, “the brain dead human body may recover from its injuries” or “the sick might get less appropriate treatment if doctors know they are potential donors.” In the case of living organ donation, people sometimes fear that after the donation, “it might reduce work efficiency, cause health deterioration or even death.” Another major obstacle in the organ shortage is the lack of a well-developed ethical framework. In reality, there are truly an immense number of people on the waiting list, and they have only two options to receive a suitable organ. The first is the legal way, which is to wait their turn. Sadly, numerous patients die while on the waiting list before an appropriate organ becomes available for transplant. The second option is an illegal way: seeking an organ in a country where they can possibly get one. In truth, in their desire to live, people may choose the second option if their resources are sufficient. This process automatically involves “organ brokers”: people who obtain organs from vulnerable poor people by force or betrayal. As mentioned earlier, the high demand and low supply lead to human trafficking.
The subjects of the study were a large number of people from different backgrounds of belief, culture, nationality, level of education and socio-economic status. The great majority of them were interviewed online using a “Google Drive Survey”, and the others in person. All statistics and information gathered from the trusted sources annotated in the reference list, together with the considerable testimonies shared by the respondents, are the fundamental evidence of the lack of a well-developed ethical framework. In conclusion, the continuously increasing number of people on the waiting list and an inadequate ethical framework lead people to commit atrocious, dehumanizing crimes. Therefore, world society should be equally obligated to think carefully and make vital decisions together for the advancement of organ donation and its ethical framework.

Keywords: donation, ethical framework, organ, transplant

Procedia PDF Downloads 147
5725 A Framework for Evaluating the QoS and Cost of Web Services Based on Its Functional Performance

Authors: M. Mohemmed Sha, T. Manesh, A. Ahmed Mohamed Mustaq

Abstract:

In the corporate world, the technology of Web services has grown rapidly, and its significance for the development of web-based applications has gradually risen over time. The success of business-to-business integration relies on finding novel partners and their services in a global business environment. However, the selection of the most suitable Web service from a list of services with identical functionality is more vital. The satisfaction level of the customer and the reputation of the Web service provider primarily depend on the extent to which the service meets the customer's requirements. In most cases, the customer of the Web service feels that he is paying for a service that is undelivered, because the customer thinks the real functionality of the web service is never reached. This leads to frequent changes of service. In this paper, a framework is proposed to evaluate the Quality of Service (QoS) and its cost so as to establish the optimal correlation between the two. This research also proposes management decisions against functional deviations of the web service from what was guaranteed at the time of selection.

Keywords: web service, service level agreement, quality of a service, cost of a service, QoS, CoS, SOA, WSLA, WsRF

Procedia PDF Downloads 413
5724 Stabilization of Spent Engine Oil Contaminated Lateritic Soil Admixed with Cement Kiln Dust for Use as Road Construction Materials

Authors: Johnson Rotimi Oluremi, A. Adedayo Adegbola, A. Samson Adediran, O. Solomon Oladapo

Abstract:

Spent engine oil contains heavy metals and polycyclic aromatic hydrocarbons which contribute to chronic health hazards, poor soil aeration, immobilisation of nutrients and lowering of pH in soil. It affects the geotechnical properties of lateritic soil, thereby constituting geotechnical and foundation problems. This study is therefore based on the stabilization of spent engine oil (SEO) contaminated lateritic soil using cement kiln dust (CKD) as a means of restoring it to its pristine state. Geotechnical tests, which include sieve analysis, Atterberg limits, compaction, California bearing ratio and unconfined compressive strength tests, were carried out on the natural, SEO-contaminated and CKD-stabilized SEO-contaminated lateritic soil samples. The natural soil, classified as A-2-7 (2) by the AASHTO classification and GC according to the Unified Soil Classification System, changed to an A-4 non-plastic soil due to SEO contamination, and even under the influence of CKD it remained unchanged. However, the maximum dry density (MDD) of the SEO-contaminated soil increased while the optimum moisture content (OMC) decreased with increasing percentages of CKD. Similarly, the bearing strength of the stabilized SEO-contaminated soil, measured by the California Bearing Ratio (CBR), increased with increasing percentages of CKD. In conclusion, spent engine oil has a detrimental effect on the geotechnical properties of the lateritic soil sample, but this can be remediated using 10% CKD as a stand-alone admixture in stabilizing spent engine oil contaminated soil.

Keywords: spent engine oil, lateritic soil, cement kiln dust, stabilization, compaction, unconfined compressive strength

Procedia PDF Downloads 386
5723 Scattered Places in Stories: Singularity and Pattern in Geographic Information

Authors: I. Pina, M. Painho

Abstract:

Increased knowledge about the nature of place and the conditions under which space becomes place is a key factor for better urban planning and place-making. Although there is broad consensus on the relevance of this knowledge, difficulties remain in relating the theoretical framework about place to urban management. Issues related to the representation of places are among the greatest obstacles to closing this gap. With this critical discussion, based on a literature review, we intend to explore, within a common framework for geographical analysis, the potential of stories to spell out place meanings, bringing together qualitative text analysis and text mining in order to capture and represent both the singularity contained in each person's life history and the patterns of social processes that shape places. The development of this reasoning is based on extensive geographical thought about place and on theoretical advances in the field of Geographic Information Science (GISc).

Keywords: discourse analysis, geographic information science, place, place-making, stories

Procedia PDF Downloads 190
5722 Data-driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers such as poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship

Procedia PDF Downloads 322
5721 [Keynote Talk]: sEMG Interface Design for Locomotion Identification

Authors: Rohit Gupta, Ravinder Agarwal

Abstract:

The surface electromyographic (sEMG) signal has the potential to identify human activities and intention. This potential is further exploited to control artificial limbs using the sEMG signal from the residual limbs of amputees. The paper deals with the development of a multichannel, cost-efficient sEMG signal interface for research applications, along with the evaluation of a proposed class-dependent statistical approach to feature selection. The sEMG signal acquisition interface was developed using the ADS1298 from Texas Instruments, a front-end interface integrated circuit for ECG applications. The sEMG signal was recorded from two lower-limb muscles for three locomotions, namely Plane Walk (PW), Stair Ascending (SA) and Stair Descending (SD). A class-dependent statistical approach is proposed for feature selection, and its performance is compared with 12 pre-existing feature vectors. To make the study more extensive, the performance of five different types of classifiers is compared. The outcome of the current piece of work proves the suitability of the proposed feature selection algorithm for locomotion recognition compared to other existing feature vectors. The SVM classifier is found to be the best-performing among the compared classifiers, with an average recognition accuracy of 97.40%. Feature vector selection emerges as the most dominant factor affecting classification performance, as it accounts for 51.51% of the total variance in classification accuracy. The results demonstrate the potential of the developed sEMG signal acquisition interface along with the proposed feature selection algorithm.
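One common class-dependent statistic for ranking a feature, the ratio of between-class to within-class variance (a Fisher-type score), can be sketched as follows; the sample values are invented, not the paper's sEMG data, and the paper's own selection method may differ.

```python
def fisher_score(values_by_class):
    """Between-class variance over within-class variance for one feature.

    values_by_class maps a class label to that class's feature values.
    """
    all_vals = [v for vals in values_by_class.values() for v in vals]
    grand_mean = sum(all_vals) / len(all_vals)
    between = sum(len(vals) * ((sum(vals) / len(vals)) - grand_mean) ** 2
                  for vals in values_by_class.values())
    within = sum(sum((v - sum(vals) / len(vals)) ** 2 for v in vals)
                 for vals in values_by_class.values())
    return between / within

# Hypothetical feature values for the three locomotion classes.
score = fisher_score({"PW": [1.0, 1.2], "SA": [2.0, 2.2], "SD": [3.0, 3.2]})
print(round(score, 2))  # a higher score means the feature separates the classes better
```

Ranking all candidate features by such a score and keeping the top few is the general shape of class-dependent statistical feature selection.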

Keywords: classifiers, feature selection, locomotion, sEMG

Procedia PDF Downloads 290
5720 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence

Authors: Mofizul Islam Awwal

Abstract:

Due to globalization, industries are growing rapidly throughout the world, which has given rise to many manufacturing organizations. Recently, service industries have also begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need a strong competitive advantage over their rivals to achieve their strategic business goals. Manufacturing industries are adopting many methods and techniques to achieve such a competitive edge. Over the last decades, manufacturing industries have successfully practiced the lean concept to optimize their production lines. Due to its huge success in the manufacturing context, lean has made its way into the service industry. Very little attention has been paid to services in the area of operations management, and service industries are far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back/front office is a demanding job, though it would likely yield improvement. Service processes are not as visible as production processes and can be very complex. The lack of research in this area has made it quite difficult for service industries, as there are no standardized frameworks for successfully implementing the lean concept in service organizations. The purpose of this research paper is to capture the present scenario of the service industry in terms of lean implementation. A thorough analysis of past literature will be done on the applicability and understanding of lean in service structures. Research papers will be classified, and critical factors will be unveiled for implementing lean in the service industry to achieve operational excellence.

Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence

Procedia PDF Downloads 372
5719 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI

Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De

Abstract:

The Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time-series Landsat imagery was used to analyze the Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time-series rainfall and minimum and maximum temperature) were also statistically correlated for regression. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and remote-sensing-based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
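For reference, the NDVI itself is a simple band ratio of near-infrared and red reflectance; this is a generic sketch with illustrative values, not the study's Landsat processing chain.

```python
# NDVI from near-infrared (NIR) and red reflectance; values are illustrative.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    if nir + red == 0:
        return 0.0  # avoid division by zero on empty pixels
    return (nir - red) / (nir + red)

print(round(ndvi(0.50, 0.10), 2))  # dense canopy, strong NIR reflectance: 0.67
print(round(ndvi(0.15, 0.20), 2))  # bare or degraded surface: -0.14
```

Healthy mangrove canopy reflects strongly in the NIR band and absorbs red light, so declining NDVI over time is a proxy for declining mangrove health.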

Keywords: aquaculture farms, LULC, Mangrove, NDVI

Procedia PDF Downloads 176
5718 Towards a Mandatory Frame of ADR in Divorce Cases: Key Elements from a Comparative Perspective for Belgium

Authors: Celine Jaspers

Abstract:

The Belgian legal system is slowly evolving toward mandatory mediation to promote ADR. One of the reasons for this evolution is the lack of use of alternative methods relative to their possible benefits. Especially in divorce cases, ADR can play a beneficial role in resolving disputes, since the emotional component is very much present. When children are involved, a solution provided by the parents may be better adapted to the child’s best interest than a court order. In the first part, the lack of use of voluntary ADR and the evolution toward mandatory ADR in Belgium will be documented through legislation, jurisprudence and social-scientific sources, with special attention to divorce cases. One of the reasons is a lack of knowledge of ADR, despite the continuing efforts of the Belgian legislator to promote it. One of the latest acts of ADR promotion was the implementation of an Act in 2018 which gives the judge the possibility to refer parties to mediation during the judicial procedure if at least one party wants it. This referral is subject to certain conditions. The parties are sent to a private mediator, recognized by the Federal Mediation Commission, to try to resolve their conflict. This means that at least one party can be mandated to try mediation (indicated as “semi-mandatory mediation”). The main goal is to establish the factors and elements that Belgium has to take into account in its further development of mandatory ADR, with consideration of the human rights perspective and the EU perspective. It is also essential to detect the dangerous pitfalls other systems have encountered in their process design. Therefore, the second part, the comparative component, will discuss the existing framework in California, USA to establish the necessary elements, possible pitfalls and considerations the Belgian legislator can take into account when further developing the framework of mandatory ADR.
The contrasting and functional method will be used to derive key elements and possible pitfalls to help Belgium improve its existing framework. The existing mandatory system in California has been in place since 1981 and is still up and running, and can thus provide valuable lessons and considerations for the Belgian system. Thirdly, the key elements from a human rights perspective and from a European Union perspective (e.g. the right of access to a judge, the right to privacy) will be discussed too, since basic human rights and European legislation and jurisprudence play a significant part in Belgian legislation as well. The main sources for this part will be the international and European treaties, legislation, jurisprudence and soft law. In the last and concluding part, the paper will list the most important elements of a mandatory ADR system design, with special attention to the dangers of these elements (e.g. whether to include or exclude domestic violence cases in the mandatory ADR framework and the consequences thereof) and to the necessary international and European rights, prohibitions and guidelines.

Keywords: Belgium, divorce, framework, mandatory ADR

Procedia PDF Downloads 149
5717 A Qualitative Study Exploring Factors Influencing the Uptake of and Engagement with Health and Wellbeing Smartphone Apps

Authors: D. Szinay, O. Perski, A. Jones, T. Chadborn, J. Brown, F. Naughton

Abstract:

Background: The uptake of health and wellbeing smartphone apps is largely influenced by popularity indicators (e.g., rankings), rather than evidence-based content. Rapid disengagement is common. This study aims to explore how and why potential users 1) select and 2) engage with such apps, and 3) how increased engagement could be promoted. Methods: Semi-structured interviews and a think-aloud approach were used to allow participants to verbalise their thoughts whilst searching for a health or wellbeing app online, followed by a guided search in the UK National Health Service (NHS) 'Apps Library' and Public Health England’s (PHE) 'One You' website. Recruitment took place between June and August 2019. Adults interested in using an app for behaviour change were recruited through social media. Data were analysed using the framework approach. The analysis is both inductive and deductive, with the coding framework being informed by the Theoretical Domains Framework. The results are further mapped onto the COM-B (Capability, Opportunity, Motivation - Behaviour) model. The study protocol is registered on the Open Science Framework (https://osf.io/jrkd3/). Results: The following targets were identified as playing a key role in increasing the uptake of and engagement with health and wellbeing apps: 1) psychological capability (e.g., reduced cognitive load); 2) physical opportunity (e.g., low financial cost); 3) social opportunity (e.g., embedded social media); 4) automatic motivation (e.g., positive feedback). Participants believed that the promotion of evidence-based apps on NHS-related websites could be enhanced through active promotion on social media, adverts on the internet, and in general practitioner practices. Future Implications: These results can inform the development of interventions aiming to promote the uptake of and engagement with evidence-based health and wellbeing apps, a priority within the UK NHS Long Term Plan ('digital first'). 
The targets identified across the COM-B domains could help organisations that provide platforms for such apps to increase impact through better selection of apps.

Keywords: behaviour change, COM-B model, digital health, mhealth

Procedia PDF Downloads 161
5716 Management of Intellectual Property Rights: Strategic Patenting

Authors: Waheed Oseni

Abstract:

This article reviews emergent global trends in intellectual property protection and identifies patenting as a strategic initiative. Recent developments in software and method of doing business patenting are fast transforming the e‐business landscape. The article discusses the emergent global regulatory framework concerning intellectual property rights and the strategic value of patenting. Important features of a corporate patenting portfolio are described. Superficially, the e‐commerce landscape appears to be dominated by dotcom start-ups or the “dotcomization” of existing brick and mortar companies. But, in reality, at its very bedrock is intellectual property (IP). In this connection, the recent avalanche of patenting of software and method‐of‐doing‐business (MDB) in the USA is a very significant development with regard to rules governing IP rights and, therefore, e‐commerce. Together with the World Trade Organization’s (WTO) IP rules, there is an emerging global regulatory framework for IP rights, an understanding of which is necessary for designing effective e‐commerce strategies.

Keywords: intellectual property, patents, methods, computer software

Procedia PDF Downloads 523
5715 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature-space dimensions of pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine-similarity-based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
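The cosine similarity underlying the matching step compares the angle between two feature vectors; this is a generic sketch with invented vectors, not the authors' full Bayesian classifier.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Parallel feature vectors score 1, orthogonal vectors score 0.
print(round(cosine_similarity([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]), 6))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

Because it depends only on direction, not magnitude, cosine similarity is robust to overall brightness scaling of the underlying image features, which is one reason it is a popular matching score in face analysis.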

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 423
5714 Teachers' Technological Pedagogical and Content Knowledge and Technology Integration in Teaching and Learning in a Small Island Developing State: A Concept Paper

Authors: Aminath Waseela, Vinesh Chandra, Shaun Nykvist

Abstract:

The success of technology integration initiatives hinges on the knowledge and skills of teachers to effectively integrate technology in classroom teaching. Consequently, gaining an understanding of teachers' technology knowledge and its integration can provide useful insights on strategies that can be adopted to enhance teaching and learning, especially in developing-country contexts where research is scant. This paper extends existing knowledge on teachers' use of technology by developing a conceptual framework that recognises how three key types of knowledge (content, pedagogy, and technology) and their integration are at the crux of teachers' technology use, while remaining amenable to empirical studies. Although this knowledge is important for effective use of technology that can result in enhanced student engagement, literature on how it leads to effective technology use and enhanced student engagement is limited. Thus, this theoretical paper proposes a framework to explore teachers' knowledge through the lens of Technological Pedagogical and Content Knowledge (TPACK); the integration of technology in classroom teaching through the Substitution Augmentation Modification and Redefinition (SAMR) model; and how this affects students' learning through the lens of Bloom's Digital Taxonomy (BDT). Studies using this framework could inform the design of professional development to support teachers in developing skills for effective use of available technology that can enhance student learning engagement.

Keywords: information and communication technology, ICT, in-service training, small island developing states, SIDS, student engagement, technology integration, technology professional development training, technological pedagogical and content knowledge, TPACK

Procedia PDF Downloads 141
5713 From Proficiency to High Accomplishment: Transformative Inquiry and Institutionalization of Mentoring Practices in Teacher Education in South-Western Nigeria

Authors: Michael A. Ifarajimi

Abstract:

The transition from being a graduate teacher to a highly accomplished teacher has been widely portrayed in the literature as challenging. Pre-service teachers are troubled by complex issues such as implementing assessment, meeting prescribed learning outcomes, taking risks, and supporting eco-sustainability, among others. This list is not exhaustive, and these concerns are further complicated when they extend beyond the classroom into the broader school setting and community. Meanwhile, the pre-service teacher education programme as currently run in Nigeria cannot adequately prepare newly trained teachers for the realities of classroom teaching, and there appears to be no formal structure in place for mentoring such teachers by more seasoned teachers in schools. The central research question of the study, therefore, is: which institutional framework can be distinguished for enactment in mentoring practices in teacher education? The study was conducted in five colleges of education in South-West Nigeria, and a sample of 1,000 pre-service teachers on their final-year practicum was randomly selected from the colleges. A pre-service teacher mentorship programme (PTMP) framework was designed and implemented, with a focus on the impact of transformative inquiry on the pre-service teacher support system. The study discovered a significant impact of mentoring on pre-service teachers' professional transformation. The study concluded that institutionalizing mentorship through transformative inquiry is a means to sustainable teacher education, professional growth, and effective classroom practice. The study recommended that the government enact policies that will promote mentoring in teacher education and establish a framework for the implementation of mentoring practices in the colleges of education in Nigeria.

Keywords: institutionalization, mentoring, pre-service teachers, teacher education, transformative inquiry

Procedia PDF Downloads 129
5712 Automatic Detection of Traffic Stop Locations Using GPS Data

Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell

Abstract:

Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to help identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections or highway congestion. The Delaunay triangulation is used to perform this assessment in the clustering phase. While most existing clustering algorithms need assumptions about the data distribution, the effectiveness of the Delaunay triangulation relies on triangulating geographical data points without such assumptions. Our proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step is to use clustering to form groups of waypoints for signalized traffic and highway congestion. A binary classifier is then applied to distinguish highway congestion from signalized stop points, using the length of the cluster to identify congestion. The proposed framework shows high accuracy, identifying the stop positions and congestion points in around 99.2% of trials. We show that it is possible, using limited GPS data, to distinguish these two stop types with high accuracy.
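The stoppage-detection and length-based classification steps above can be sketched as follows (a toy version: a simple distance threshold and consecutive-index grouping stand in for the paper's Delaunay-triangulation clustering, and all thresholds are illustrative, not the authors' values):

```python
from math import hypot

def detect_stops(waypoints, speed_eps=1.0, min_pts=3, congestion_len=15):
    """waypoints: list of (x, y) positions sampled at a fixed interval.
    Returns (label, indices) per stop cluster."""
    # 1) flag points whose travelled distance to the next sample is tiny
    stopped = [i for i in range(len(waypoints) - 1)
               if hypot(waypoints[i + 1][0] - waypoints[i][0],
                        waypoints[i + 1][1] - waypoints[i][1]) < speed_eps]
    # 2) group consecutive stopped indices into clusters
    clusters, current = [], []
    for i in stopped:
        if current and i != current[-1] + 1:
            clusters.append(current)
            current = []
        current.append(i)
    if current:
        clusters.append(current)
    # 3) binary classification by cluster length: a long dwell -> congestion
    return [("congestion" if len(c) >= congestion_len else "signal", c)
            for c in clusters if len(c) >= min_pts]
```

A short dwell (a few samples) reads as a signalized stop; a long dwell reads as congestion.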

Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data

Procedia PDF Downloads 272
5711 The Outcome of Using Machine Learning in Medical Imaging

Authors: Adel Edwar Waheeb Louka

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. This underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field suggests that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays labelled COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning, then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 separate images; the training images are cropped beforehand to eliminate distractions. The image segmentation model uses an improved U-Net architecture to extract the lung mask from the chest X-ray image; it is trained on 8,577 images with a validation split of 20%. The models' accuracy, precision, recall, F1-score, IoU, and loss are calculated on an external validation dataset. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to improve the experience of medical professionals and provide insight into the future of the methods used.
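The intersection-over-union (IoU) metric by which the segmentation model is evaluated is simple to state in code (a generic sketch of the standard metric, not the study's evaluation script):

```python
import numpy as np

def iou(pred, target):
    """Intersection-over-Union between two binary masks: the overlap area
    divided by the combined area. Empty-vs-empty is treated as a perfect 1.0."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0
```

An IoU of 0.928 thus means the predicted lung mask and the ground-truth mask overlap on about 93% of their combined pixels.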

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 65
5710 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
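The model-averaging step described above can be sketched in a few lines (an illustrative stand-in with made-up probabilities and class labels; the study's actual models are logistic regression, random forest, and a neural network):

```python
def ensemble_average(prob_sets):
    """Average the class-probability vectors produced by the top-performing
    models, so no single model's bias dominates the final prediction."""
    n = len(prob_sets)
    return [sum(p[i] for p in prob_sets) / n for i in range(len(prob_sets[0]))]

def predict_label(probs, labels=("good", "moderate", "unhealthy")):
    """Pick the class with the highest averaged probability."""
    return labels[max(range(len(probs)), key=probs.__getitem__)]
```

Each individual model emits one probability vector per forecast window; the ensemble's forecast is the label with the highest averaged probability.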

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 122
5709 Framework Development of Carbon Management Software Tool in Sustainable Supply Chain Management of Indian Industry

Authors: Sarbjit Singh

Abstract:

This framework development explored the status of green supply chain management (GSCM) in manufacturing SMEs and concluded that there was a significant gap with respect to carbon emissions measurement in supply chain activities. Measuring carbon emissions within supply chains is an important green initiative toward their reduction. The majority of SMEs faced difficulty quantifying the greenhouse gas emissions in their supply chains and making them low-carbon supply chains. Thus, carbon management initiatives were amalgamated with supply chain activities in order to measure and reduce carbon emissions, conforming to the GHG Protocol scopes. This work therefore covers the development of a carbon management software (CMS) tool to quantify carbon emissions for effective carbon management. The tool is inexpensive and easy for industries to use to manage the carbon emissions within their supply chains.
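At its core, a CMS tool of this kind multiplies activity data by emission factors and groups the result by GHG Protocol scope. A minimal sketch (the emission factors and activity names below are hypothetical; real factors come from GHG Protocol guidance and regional databases):

```python
# Hypothetical emission factors (kg CO2e per unit of activity).
FACTORS = {"diesel_l": 2.68, "grid_kwh": 0.82, "freight_tkm": 0.11}
# Which GHG Protocol scope each activity belongs to (illustrative mapping).
SCOPE_OF = {"diesel_l": "scope1", "grid_kwh": "scope2", "freight_tkm": "scope3"}

def footprint(activity_data):
    """Multiply each activity quantity by its emission factor and
    accumulate the result under the matching GHG Protocol scope."""
    scopes = {"scope1": 0.0, "scope2": 0.0, "scope3": 0.0}
    for activity, qty in activity_data.items():
        scopes[SCOPE_OF[activity]] += qty * FACTORS[activity]
    return scopes
```

For example, 100 L of diesel, 1,000 kWh of grid electricity, and 500 tonne-km of outsourced freight would land in scopes 1, 2, and 3 respectively.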

Keywords: carbon emissions, carbon management software, supply chain management, Indian industry

Procedia PDF Downloads 461
5708 Modular Data and Calculation Framework for a Technology-based Mapping of the Manufacturing Process According to the Value Stream Management Approach

Authors: Tim Wollert, Fabian Behrendt

Abstract:

Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas its design principles (e.g., pull, value-adding, customer orientation) remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activity. The digitalization of processes in the context of Industry 4.0 enables new opportunities to reduce these manual efforts and make the VSM approach more agile. This paper aims at providing a modular data and calculation framework that utilizes the available business data, provided by information and communication technologies, to automate the value stream mapping process, with a focus on the manufacturing process.
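A calculation module fed from ERP data might compute the classic value stream figures per process step like this (a minimal sketch; the field names and units are illustrative assumptions, not the paper's data model):

```python
def value_stream_metrics(process_steps):
    """Compute lead time, value-adding time, and flow efficiency from
    per-step timing data. Each step is a dict with 'cycle_time' (seconds
    of value-adding work) and 'wait_time' (seconds spent queued)."""
    lead_time = sum(s["cycle_time"] + s["wait_time"] for s in process_steps)
    value_add = sum(s["cycle_time"] for s in process_steps)
    return {"lead_time": lead_time,
            "value_add_time": value_add,
            "flow_efficiency": value_add / lead_time if lead_time else 0.0}
```

Automating exactly this kind of aggregation over live ERP timestamps is what replaces the manual stopwatch-and-whiteboard mapping exercise.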

Keywords: lean management 4.0, value stream management (VSM) 4.0, dynamic value stream mapping, enterprise resource planning (ERP)

Procedia PDF Downloads 146
5707 China-Africa Diplomatic Discourse: Reconstructing the Principle of “Yi” as a Framework for Analyzing Sino-Africa Cooperation

Authors: Modestus Queen

Abstract:

As we know, diplomatic language carries the political ideology and cultural stance of a country. Given that China's diplomatic discourse is complex and heavily flavored with Chinese characteristics, one of the core goals of President Xi's administration is to properly tell the story of China. This cannot be done without proper translation or interpretation of major Chinese diplomatic concepts. Therefore, this research seeks to interpret the relevance of "Yi" as used in "Zhèngquè Yì Lì Guān". The author argues that it is not enough to translate a document; it must be properly interpreted so that it is politically, economically, culturally, and diplomatically relevant to the target audience, in this case, African people. The first finding of the current study indicates that literal translation is a poor strategy, especially for Chinese diplomatic discourse; the second, that "Yi" can be used as a framework to analyze Sino-Africa relations from economic, social, and political perspectives; and the third, that "Yi" is the guiding principle of China's foreign policy towards Africa.

Keywords: Yi, justice, China-Africa, interpretation, diplomatic discourse, discourse reconstruction

Procedia PDF Downloads 132
5706 Open Source Knowledge Management Approach to Manage and Disseminate Distributed Content in a Global Enterprise

Authors: Rahul Thakur, Onkar Chandel

Abstract:

Red Hat is the world leader in providing open source software and solutions. A global enterprise like Red Hat has unique issues connecting employees with content because of distributed offices, multiple teams spread across geographies, multiple languages, and different cultures. Employees of a global company create content that is distributed across departments, teams, regions, and countries. This makes finding the best content difficult, since owners keep iterating on the existing content. When employees are unable to find the content, they end up creating it once again, duplicating existing material and effort. Employees may also fail to find the relevant content and spend time reviewing obsolete, duplicate, or irrelevant content. On average, a person spends 15 minutes a day in failed searches, which can result in missed business opportunities, employee frustration, and substandard deliverables. The Red Hat Knowledge Management Office (KMO) applied an 'open source strategy' to solve these problems. Under the open source strategy, decisions are taken collectively; the strategy aims at accomplishing common goals with the help of communities. The objectives of this initiative were to save employees' time, get them authentic content, improve their content search experience, avoid duplicate content creation, provide context-based search, improve analytics, improve content management workflows, automate content classification, and automate content upload. This session will describe the open source strategy, its applicability to content management, challenges, recommended solutions, and outcomes.
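One simple building block for avoiding duplicate content creation is flagging near-duplicates before a new document is uploaded. A bag-of-words cosine similarity check is the textbook version (an illustrative sketch, not Red Hat's actual implementation, which would likely use richer text features):

```python
from collections import Counter
from math import sqrt

def similarity(doc_a, doc_b):
    """Cosine similarity over term counts of two documents: 1.0 for
    identical word distributions, 0.0 for no shared terms."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0
```

A KM pipeline would compare an incoming draft against existing documents and warn the author when similarity exceeds a chosen threshold.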

Keywords: content classification, content management, knowledge management, open source

Procedia PDF Downloads 209
5705 Optimization Technique for the Contractor’s Portfolio in the Bidding Process

Authors: Taha Anjamrooz, Sareh Rajabi, Salwa Bheiry

Abstract:

Selection among the available projects in bidding processes is one of the essential areas for a contractor to concentrate on. It is important for the contractor to choose the right projects within its portfolio during the tendering stage based on certain criteria. First, it should align the bidding process with its organization strategies and goals as a screening process, to have the right portfolio pool to start with. Second, it should set the proper framework and use a suitable technique to optimize its selection process, concentrating its efforts during the tender stage on winning. In this research paper, a two-step framework is proposed to increase the efficiency of the contractor’s bidding process and the chance of winning new projects. In this framework, all projects initially pass through a first-stage screening process, in which the portfolio basket is evaluated and adjusted in accordance with the organization's strategies into a reduced version of the portfolio pool that is in line with the organization's activities. In the second stage, the contractor uses linear programming to optimize the portfolio pool based on available resources such as manpower, light equipment, heavy equipment, financial capability, return on investment, and the success rate of winning the bid. This optimization model will assist the contractor in utilizing its internal resources to the maximum and increase its chance of winning new projects, considering past experience with clients, the relationship built between the two parties, and complexity in the execution of the projects. The objective of this research is to increase the contractor's winning chance in the bidding process based on the success rate and expected return on investment.
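The second-stage optimization can be illustrated in miniature with a 0/1 knapsack over a single aggregated resource budget: a simplified stand-in for the paper's multi-resource linear program, with made-up project data. Each project's value would, in the paper's terms, be its expected return weighted by the probability of winning the bid:

```python
def select_projects(projects, capacity):
    """Pick the subset of projects maximizing total expected value without
    exceeding the resource budget. projects: (name, resource_need, value).
    dp[c] holds (best value, chosen names) using at most c resource units."""
    dp = [(0.0, ())] * (capacity + 1)
    for name, need, value in projects:
        for c in range(capacity, need - 1, -1):  # iterate downward: 0/1 choice
            cand = (dp[c - need][0] + value, dp[c - need][1] + (name,))
            if cand[0] > dp[c][0]:
                dp[c] = cand
    return dp[capacity]
```

A full implementation would carry one constraint per resource (manpower, equipment, finance) and solve the resulting linear program instead.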

Keywords: bidding process, internal resources, optimization, contracting portfolio management

Procedia PDF Downloads 140
5704 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging

Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen

Abstract:

Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests where protein value has been determined by the FOSS Infratec NOVA, the gold industry standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. Protein regression is the problem to solve in the first dataset, while variety classification is the problem in the second. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required by classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolutions is conducted, and these results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested, including centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
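Of the preprocessing techniques compared above, standard normal variate (SNV) is particularly compact: each spectrum is centered and scaled by its own mean and standard deviation, suppressing multiplicative scatter effects. A generic sketch of the standard transform (not the study's code):

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: normalize each spectrum (row) by its own
    mean and standard deviation. spectra: (n_samples, n_wavelengths)."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std
```

After SNV, every spectrum has zero mean and unit variance across its wavelengths, so downstream models see shape rather than overall intensity.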

Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques

Procedia PDF Downloads 97
5703 An Agile, Intelligent and Scalable Framework for Global Software Development

Authors: Raja Asad Zaheer, Aisha Tanveer, Hafza Mehreen Fatima

Abstract:

Global Software Development (GSD) is becoming a common norm in the software industry, despite the fact that the global distribution of teams presents special issues for the effective communication and coordination of those teams. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques/tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been successfully used with distributed teams. We employed an exploratory research method to analyze recent studies on the challenges of GSD and their proposed solutions. In our study, we took a deep look at six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We established that none of these challenges can be neglected for distributed teams of any kind; they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper we focus on creating a scalable framework for detecting and overcoming these commonly faced challenges. Our objective is to suggest agile techniques/tools relevant to a particular problem faced by organizations managing distributed teams. We focus mainly on Scrum and XP techniques/tools because they are widely accepted and used in the industry. Our solution identifies the problem and suggests an appropriate technique/tool to help solve it, based on a globally shared knowledgebase. A cause-and-effect relationship can be established using a fishbone diagram based on the inputs provided for commonly faced issues; based on the identified cause, our framework suggests a suitable tool. Hence, the proposed scalable, extensible, self-learning, intelligent framework will help implement and assess GSD and achieve the maximum out of it. The globally shared knowledgebase will help new organizations easily adopt best practices set forth by practicing organizations.
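The challenge-to-suggestion step of such a framework reduces, at its simplest, to a lookup against the shared knowledgebase (the mappings below are illustrative examples; a production framework would, as the paper proposes, learn and refine them from practicing organizations):

```python
# Illustrative challenge -> candidate agile techniques/tools knowledgebase.
KNOWLEDGEBASE = {
    "communication": ["daily scrum over video", "shared backlog tool"],
    "temporal differences": ["overlapping core hours", "asynchronous stand-ups"],
    "knowledge sharing": ["pair programming (XP)", "shared wiki"],
}

def suggest(challenge):
    """Return candidate techniques/tools for a reported challenge,
    mirroring the cause -> suggestion step of the framework."""
    return KNOWLEDGEBASE.get(challenge.lower(), ["no match: escalate to review"])
```

The fishbone analysis sits in front of this lookup, turning raw symptoms into one of the catalogued root-cause categories.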

Keywords: agile project management, agile tools/techniques, distributed teams, global software development

Procedia PDF Downloads 307
5702 Realizing the Full Potential of Islamic Banking System: Proposed Suitable Legal Framework for Islamic Banking System in Tanzania

Authors: Maulana Ayoub Ali, Pradeep Kulshrestha

Abstract:

The laws of any given secular state contribute hugely to the growth of the Islamic banking system, because the system uses conventional laws to govern its activities. Therefore, the former should be ready to accommodate the latter in order to make the Islamic banking system work properly without affecting the current conventional banking system. Islamic financial rules have been practiced since the birth of Islam. Following the recent world economic challenges in the financial sector, a quick rebirth of the contemporary Islamic ethical banking system took place. This rise of the Islamic banking system is due to various reasons, including but not limited to the failure of the interest-based economy to solve financial problems around the globe. The Islamic banking system has therefore been adopted as an alternative banking system in order to help recover the badly damaged global financial sector. But the Islamic banking system has been facing a number of challenges which hinder its smooth operation in different parts of the world. It is not the aim of this paper to discuss challenges other than legal ones, though others are touched on where it is proper to do so. Generally, many things were discovered in the course of writing this paper. The most important part concerns the regulatory and supervisory framework for the Islamic banking system in Tanzania and other nations, which is considered crucial for the development of the Islamic banking industry. This paper analyses what was observed in the study of that area and recommends necessary actions to be taken in a bid to help the Islamic banking system reach its climax of serving the larger community by providing an ethical, equitable, affordable, interest-free, and society-centred banking system around the globe.

Keywords: Islamic banking, interest free banking, ethical banking, legal framework

Procedia PDF Downloads 145