Search results for: improvement of model accuracy and reliability

19704 F-VarNet: Fast Variational Network for MRI Reconstruction

Authors: Omer Cahana, Maya Herman, Ofer Levi

Abstract:

Magnetic resonance imaging (MRI) is a lengthy medical scan, owing mainly to the long acquisition time imposed by the traditional sampling theorem, which defines a lower bound on the sampling rate. However, the scan can still be accelerated by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve this, two properties must hold: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in deep learning (DL), which has demonstrated tremendous success in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplify the sensitivity map estimation (SME) module, which contains many layers unnecessary for this task. These improvements yield a significant decrease in computational cost as well as higher accuracy.
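
As an illustration of the multi-scale dilated-convolution idea described in this abstract, a minimal PyTorch sketch follows. The channel counts, dilation rates, and module names are assumptions for illustration, not the authors' F-VarNet code.

```python
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    """Parallel 3x3 convolutions with increasing dilation, then a 1x1 fusion.

    Illustrative only: channel counts and dilation rates are assumptions,
    not taken from the F-VarNet paper.
    """
    def __init__(self, in_ch: int, out_ch: int, dilations=(1, 2, 4)):
        super().__init__()
        # Each branch sees a wider context as the dilation grows,
        # while padding=dilation keeps the spatial size unchanged.
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.fuse(feats)
```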

Keywords: MRI, deep learning, variational network, computer vision, compressed sensing

Procedia PDF Downloads 162
19703 Prediction of Nonlinear Torsional Behavior of High Strength RC Beams

Authors: Woo-Young Jung, Minho Kwon

Abstract:

Seismic design criteria based on the performance of structures have recently been adopted by practicing engineers in response to destructive earthquakes. A simple but efficient structural-analysis tool capable of predicting both the strength and ductility is needed to analyze reinforced concrete (RC) structures under such events. A three-dimensional lattice model is developed in this study to analyze torsion in high-strength RC members. Optimization techniques for determining the optimal variables in each lattice model are introduced. Pure torsion tests of RC members are performed to validate the proposed model. Correlation studies between the numerical and experimental results confirm that the proposed model captures the salient features of the experimental results.

Keywords: torsion, non-linear analysis, three-dimensional lattice, high-strength concrete

Procedia PDF Downloads 351
19702 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning

Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag

Abstract:

The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints to enforce the conservation of both mass and pressure over the entire predicted flow fields. The model is then trained on the Large Eddy Simulation (LES) results of a natural turbulent mixing layer for two different Reynolds number cases (Re = 3000 and 30000). The model predictions show excellent agreement with the corresponding CFD solutions in terms of both the spatial distribution and the temporal evolution of turbulent mixing. Such promising prediction performance opens up the possibility of performing accurate high-resolution manifold-based combustion simulations at low computational cost, accelerating the iterative design process of new combustion systems.
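
The physics-guided loss described above can be sketched as a data term plus soft conservation penalties. The following PyTorch fragment is a hedged illustration: the field layout (u, v, p channels), the finite-difference divergence, and the penalty weights are assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def physics_guided_loss(pred, target, lam_mass=0.1, lam_pres=0.1):
    """Data loss plus soft physical constraints (hedged sketch).

    pred/target: (batch, 3, H, W) tensors holding (u, v, p) fields.
    Mass conservation is penalised via the divergence of the velocity
    field, pressure conservation via the mean-pressure mismatch.
    """
    data_loss = F.mse_loss(pred, target)

    u, v, p = pred[:, 0], pred[:, 1], pred[:, 2]
    # Central finite differences approximate du/dx and dv/dy
    du_dx = (u[:, :, 2:] - u[:, :, :-2]) / 2.0
    dv_dy = (v[:, 2:, :] - v[:, :-2, :]) / 2.0
    divergence = du_dx[:, 1:-1, :] + dv_dy[:, :, 1:-1]
    mass_loss = divergence.pow(2).mean()

    # Penalise drift of the mean pressure over the predicted field
    pres_loss = (p.mean(dim=(1, 2)) - target[:, 2].mean(dim=(1, 2))).pow(2).mean()
    return data_loss + lam_mass * mass_loss + lam_pres * pres_loss
```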

Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling

Procedia PDF Downloads 91
19701 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence

Authors: Muhammad Bilal Shaikh

Abstract:

Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. 
The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.

Keywords: multimodal AI, computer vision, NLP, mineral processing, mining

Procedia PDF Downloads 68
19700 Optimizing Productivity and Quality through the Establishment of a Learning Management System for an Agency-Based Graduate School

Authors: Maria Corazon Tapang-Lopez, Alyn Joy Dela Cruz Baltazar, Bobby Jones Villanueva Domdom

Abstract:

The requirement for an organization implementing a quality management system to sustain compliance and commit to continuous improvement is ever higher. Offices and units are expected to comply consistently with the established processes and procedures. The Development Academy of the Philippines (DAP) has been operating under project management, for which it holds a quality management certification. To further realize its mandate as a think tank and capacity builder of the government, DAP expanded its operations and began granting graduate degrees through its Graduate School of Public and Development Management (GSPDM). As the academic arm of the Academy, GSPDM offers graduate degree programs on public management and productivity & quality aligned with the institutional thrusts. For a time, the documented procedures and processes of project management seemed to fit the Graduate School. However, there has been significant growth in the operations of the GSPDM in terms of the graduate programs offered, which directly increased the number of students. There is an apparent necessity to align the project management system with a more educational system; otherwise, it will no longer be responsive to the developments taking place. The GSPDM strongly advocates and encourages its students to pursue internal and external improvement to cope with the challenges of providing quality service to their own clients and to the country. If innovation does not take root in the grounds of GSPDM, how will it serve the purpose of "walking the talk"? This research was conducted to assess the flow of the existing internal operations and processes of DAP's project management and GSPDM's school management, as a basis for developing a system that harmonizes the two into one: the Learning Management System. The study documented the existing processes of GSPDM, following the project management phases of conceptualization & development, negotiation & contracting, mobilization, implementation, and closure, in flow charts of the key activities. The primary respondents were the groups involved in the delivery of the graduate programs: the executive, the learning management team, and the administrative support offices. The Learning Management System (LMS) shall capture the unique and critical processes of the GSPDM as a degree-granting unit of the Academy. The LMS is the harmonized project management and school management system that shall serve as the standard system and procedure for all programs within the GSPDM. The unique processes cover the three important areas of school management – student, curriculum, and faculty. The required processes of these main areas, such as enrolment, course syllabus development, and faculty evaluation, were appropriately placed within the phases of the project management system. Further, the research identified critical reports and generated manageable documents and records to ensure accurate, consistent, and reliable information. The researchers made an in-depth review of the DAP-GSPDM mandate, analyzed the various documents, and conducted a series of focus group discussions. A comprehensive review of the existing flow chart system and of various models of school management systems was also made. The final output of the research is a work instructions manual that will be presented to the Academy's Quality Management Council and eventually become an additional scope for ISO certification. The manual shall include documented forms, iterative flow charts, and a program Gantt chart, with a parallel development of automated systems.

Keywords: productivity, quality, learning management system, agency-based graduate school

Procedia PDF Downloads 319
19699 E-Governance: A Key for Improved Public Service Delivery

Authors: Ayesha Akbar

Abstract:

Public service delivery has witnessed significant improvement with the integration of information and communication technology (ICT). ICT not only improves the management structure, with advanced technology for the surveillance of service delivery, but also provides evidence for informed decisions and policy. Pakistan's public sector organizations have generally not been able to produce good results in ensuring service delivery. Notwithstanding, some public sector organizations in Pakistan have adopted modern technology and proved their credibility by providing better service delivery standards. These good indicators provide a sound basis for integrating technology into public sector organizations and for a policy shift towards evidence-based policy making. Rescue-1122 is a public sector organization that provides emergency services and has proved to be a successful model for service delivery, saving human lives and supporting human development in Pakistan. Information about the organization was gathered using a qualitative research methodology, drawing on primary and secondary sources, including the Rescue-1122 website, official reports of organizations such as the UNDP (United Nations Development Programme) and the WHO (World Health Organization), and 10 in-depth interviews with senior administrative staff working in the Lahore offices. This information was incorporated into the study for a better understanding of the organization and its management procedures. Rescue-1122 represents a successful model of delivering services efficiently in disaster management. The management of Rescue-1122 has strategized its policies and procedures to develop a comprehensive model with the integration of technology; this model provides efficient service delivery while maintaining the standards of the organization. The service delivery model of Rescue-1122 works on two fronts: the front-office interface and the back-office interface. The back office defines the operational procedures and assures staff compliance, whereas the front office, equipped with the latest technology and good infrastructure, handles the emergency calls. Both ends are integrated with satellite-based vehicle tracking, a wireless system, a fleet monitoring system, and IP cameras, which monitor every move of the staff to ensure better services and to pinpoint distortions in the services. The standard time for reaching the emergency spot is 7 minutes, and while a case is being handled, the driver's behavior, the traffic volume, and the technical assistance being provided are monitored by the front office. The whole body of information is then uploaded from the provincial offices to the main dashboard at the Lahore headquarters. Rescue-1122 deploys the latest technology to deliver efficient services, investigate flaws where found, and develop data for informed decision-making. Other public sector organizations in Pakistan can develop similar models to integrate technology for improving service delivery and to develop evidence for informed decisions and policy making.

Keywords: data, e-governance, evidence, policy

Procedia PDF Downloads 247
19698 Efficacy of Technology for Successful Learning Experience; Technology Supported Model for Distance Learning: Case Study of Botho University, Botswana

Authors: Ivy Rose Mathew

Abstract:

The purpose of this study is to outline the efficacy of technology and the opportunities it can bring to implementing a successful delivery model in distance learning. Distance learning has proliferated over the past few years across the world. Challenges faced by distance education students include a lack of motivation, a sense of isolation, and a need for greater and improved communication. The author therefore proposes a technology-supported model for distance learning, closely mirroring traditional face-to-face learning, that can be adopted by distance learning providers. This model suggests the use of a range of technologies and social networking facilities, with the aim of creating a more engaging and sustaining learning environment to help overcome the isolation often noted by distance learners. While discussing the possibilities, the author also highlights the complexity and practical challenges of implementing such a model. Design/methodology/approach: Theoretical issues from previous research related to successful models for distance learning providers are considered, together with the analysis of a case study from one of the largest private tertiary institutions in Botswana, Botho University. This case study illustrates important aspects of the distance learning delivery model and provides insights into how curriculum development is planned, quality assurance is done, and learner support is assured for a successful distance learning experience. Research limitations/implications: While some aspects of this study may not be applicable to other contexts, new providers of distance learning can adapt the key principles of this delivery model.

Keywords: distance learning, efficacy, learning experience, technology supported model

Procedia PDF Downloads 247
19697 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) is relatively simple to record, it is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, a person's heart condition can be identified in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers seeking the best method for detecting normal signals from abnormal ones. The data cover both genders, the recording time varies from several seconds to several minutes, and all records are labeled normal or abnormal. Because of the limited recording time of the ECG signal and its similarity, in some diseases, to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. Simulation results in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM were 0.893 and 0.947, respectively. Moreover, the results indicated that greater use of nonlinear characteristics in the classification yielded better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven the development of research in this field. The ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease; however, given the limited recording time and the fact that some information in this signal is hidden from the physician's view, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
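
The HRV pipeline outlined in this abstract (R-peak detection, R-R intervals, then linear and return-map features) can be sketched in a few lines of Python. This is an illustrative approximation: a plain peak finder stands in for the Pan-Tompkins detector, and the feature set is a common subset, not the paper's full set.

```python
import numpy as np
from scipy.signal import find_peaks

def hrv_features(ecg: np.ndarray, fs: float) -> np.ndarray:
    """Simple linear and nonlinear HRV features from one ECG record."""
    # Detect R peaks: tall, well-separated local maxima
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs),
                          height=np.percentile(ecg, 95))
    rr = np.diff(peaks) / fs                    # R-R intervals in seconds

    sdnn = rr.std()                             # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term variability
    # Poincare plot (return map) descriptors SD1/SD2
    d = np.diff(rr)
    sd1 = np.sqrt(0.5) * d.std()
    sd2 = np.sqrt(max(2 * rr.var() - 0.5 * d.var(), 0.0))
    return np.array([rr.mean(), sdnn, rmssd, sd1, sd2])
```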

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 262
19696 On the Use of Analytical Performance Models to Design a High-Performance Active Queue Management Scheme

Authors: Shahram Jamali, Samira Hamed

Abstract:

One of the open issues in the Random Early Detection (RED) algorithm is how to set its parameters to reach high performance under the dynamic conditions of the network. Whereas the original RED uses fixed values for its parameters, this paper follows a model-based approach to upgrade the performance of the RED algorithm. It models the router's queue behavior using a Markov model and uses this model to predict future conditions of the queue. This prediction helps the proposed algorithm tune RED's parameters and provide better efficiency and performance. Extensive packet-level simulations confirm that the proposed algorithm, called Markov-RED, outperforms RED and FARED in terms of queue stability, bottleneck utilization, and dropped packet count.
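
For reference, the RED mechanism that Markov-RED tunes can be sketched as follows; here the thresholds are plain arguments, whereas the paper derives them from a Markov prediction of the queue. This is a generic illustration, not the authors' implementation.

```python
import random

def red_drop(avg_q: float, min_th: float, max_th: float, max_p: float) -> bool:
    """Classic RED drop decision from the (EWMA) average queue length.

    Markov-RED adapts min_th, max_th, and max_p at run time; this baseline
    keeps them fixed, as in the original algorithm.
    """
    if avg_q < min_th:
        return False                  # light load: never drop
    if avg_q >= max_th:
        return True                   # heavy congestion: always drop
    # Drop probability ramps linearly between the two thresholds
    p = max_p * (avg_q - min_th) / (max_th - min_th)
    return random.random() < p
```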

Keywords: active queue management, RED, Markov model, random early detection algorithm

Procedia PDF Downloads 539
19695 Emotion-Convolutional Neural Network for Perceiving Stress from Audio Signals: A Brain Chemistry Approach

Authors: Anup Anand Deshmukh, Catherine Soladie, Renaud Seguier

Abstract:

Emotion plays a key role in many applications, such as healthcare, where gathering patients' emotional behavior is valuable. Unlike typical ASR (Automated Speech Recognition) problems, which focus on 'what was said', it is equally important to understand 'how it was said.' Certain emotions are given more importance due to their effectiveness in understanding human feelings. In this paper, we propose an approach that models human stress from audio signals. The research challenge in speech emotion detection is finding the appropriate set of acoustic features corresponding to an emotion. Another difficulty lies in defining the very meaning of emotion and being able to categorize it in a precise manner. Supervised machine learning models, including state-of-the-art deep learning classification methods, rely on the availability of clean and labelled data. One of the problems in affective computing is the limited amount of annotated data; the existing labelled emotion datasets are highly subjective to the perception of the annotator. We address the first issue, feature selection, by exploiting traditional MFCC (Mel-Frequency Cepstral Coefficients) features in a Convolutional Neural Network. Our proposed Emo-CNN (Emotion-CNN) architecture treats speech representations in a manner similar to how CNNs treat images in a vision problem. Our experiments show that Emo-CNN consistently and significantly outperforms popular existing methods over multiple datasets. It achieves 90.2% categorical accuracy on the Emo-DB dataset. We claim that Emo-CNN is robust to speaker variations and environmental distortions. The proposed approach achieves 85.5% speaker-dependent categorical accuracy on the SAVEE (Surrey Audio-Visual Expressed Emotion) dataset, beating the existing CNN-based approach by 10.2%. To tackle the second problem, the subjectivity of stress labels, we use Lovheim's cube, a 3-dimensional projection of emotions. Monoamine neurotransmitters are chemical messengers in the brain that transmit signals when emotions are perceived; the cube aims at explaining the relationship between these neurotransmitters and the positions of emotions in 3D space. The emotion representations learnt by Emo-CNN are mapped to the cube using three-component PCA (Principal Component Analysis), which is then used to model human stress. This approach not only circumvents the need for labelled stress data but also complies with the psychological theory of emotions given by Lovheim's cube. We believe that this work is the first step towards creating a connection between artificial intelligence and the chemistry of human emotions.
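
The MFCC 'image' front end that Emo-CNN consumes can be sketched as below. The number of coefficients and the frame count are illustrative assumptions, not the paper's settings.

```python
import librosa
import numpy as np

def mfcc_image(path: str, n_mfcc: int = 40, frames: int = 128) -> np.ndarray:
    """Turn one utterance into a fixed-size MFCC 'image' for a CNN."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    # Pad or crop the time axis so every sample has the same shape
    if mfcc.shape[1] < frames:
        mfcc = np.pad(mfcc, ((0, 0), (0, frames - mfcc.shape[1])))
    return mfcc[:, :frames]
```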

Keywords: deep learning, brain chemistry, emotion perception, Lovheim's cube

Procedia PDF Downloads 154
19694 Singularization: A Technique for Protecting Neural Networks

Authors: Robert Poenaru, Mihail Pleşa

Abstract:

In this work, a solution that addresses the protection of pre-trained neural networks is developed: Singularization. This method applies permutations to the weight matrices of a pre-trained model, introducing a form of structured noise that obscures the original model's architecture. These permutations make it difficult for an attacker to reconstruct the original model, even if the permuted weights are obtained. Experimental benchmarks indicate that applying singularization has a profound impact on model performance, often degrading it to the point where retraining from scratch becomes necessary to recover functionality; this makes it particularly effective for securing intellectual property in neural networks. Moreover, unlike other approaches, singularization is lightweight and computationally efficient, which makes it well suited for resource-constrained environments. Our experiments also demonstrate that the technique performs efficiently in various image classification tasks, highlighting its broad applicability and practicality in real-world scenarios.
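
A minimal sketch of the core idea, permuting the rows and columns of each weight matrix under a secret seed, is shown below; the actual Singularization scheme may differ in detail.

```python
import numpy as np

def singularize(weights: list[np.ndarray], seed: int) -> list[np.ndarray]:
    """Permute rows and columns of each weight matrix with a secret seed.

    Hedged sketch of the idea in the abstract: without knowledge of the
    permutations (recoverable only from the seed), the shuffled weights no
    longer compose into a working network.
    """
    rng = np.random.default_rng(seed)
    protected = []
    for w in weights:
        row_perm = rng.permutation(w.shape[0])
        col_perm = rng.permutation(w.shape[1])
        protected.append(w[row_perm][:, col_perm])   # structured "noise"
    return protected
```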

Keywords: machine learning, ANE, CNN, security

Procedia PDF Downloads 14
19693 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopeia was used to create a process flow for ZnSO4 tablets. For each step in the process, formulae were input into two spreadsheets to automate the calculations. Further checks were created within the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set in the protocol were met. Significant p-values (p < 0.05, α = 0.05, at a 95% confidence interval) were obtained from Student's t-test comparisons of the mean values of the manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in the titrimetric evaluation of ZnSO4 tablets. Human calculation errors were minimized when the procedures were automated in quality control laboratories. The assay procedure for the formulation was achieved in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
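
The two automated calculation steps can be illustrated with a short Python sketch of the underlying chemistry (EDTA complexes Zn2+ in a 1:1 ratio). The constants and rounding conventions in the validated spreadsheets follow the USP monograph and may differ from this simplified illustration.

```python
def edta_molarity(zn_mass_g: float, titre_ml: float, zn_mw: float = 65.38) -> float:
    """Standardise an EDTA solution against a zinc reference.

    EDTA binds Zn2+ 1:1, so molarity = moles of zinc / litres of titrant.
    """
    moles_zn = zn_mass_g / zn_mw
    return moles_zn / (titre_ml / 1000.0)

def assay_percent(titre_ml: float, edta_m: float, tablets: int,
                  label_mg_zn: float, zn_mw: float = 65.38) -> float:
    """Percent of label claim for zinc from a complexometric titration."""
    mg_zn_found = titre_ml / 1000.0 * edta_m * zn_mw * 1000.0
    mg_zn_per_tablet = mg_zn_found / tablets
    return 100.0 * mg_zn_per_tablet / label_mg_zn
```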

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 169
19692 Prediction of the Tunnel Fire Flame Length by Hybrid Model of Neural Network and Genetic Algorithms

Authors: Behzad Niknam, Kourosh Shahriar, Hassan Madani

Abstract:

This paper demonstrates the applicability of hybrid neural networks that combine back-propagation networks (BPN) with genetic algorithms (GAs) for predicting the flame length of tunnel fires. A hybrid neural network model has been developed to predict the flame length of a tunnel fire based on parameters such as fire heat release rate, air velocity, tunnel width, height, and cross-sectional area. The network was trained with data obtained from experimental work. The hybrid neural network model learned the relationship for predicting the flame length in just 3000 training epochs; after successful learning, the model was able to predict the flame length.
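
A hedged sketch of the GA half of such a hybrid is shown below: the GA searches over a flat weight vector, and back-propagation (not shown) would fine-tune the best individual. The operators and rates are generic choices, not the paper's configuration.

```python
import random

def evolve_weights(fitness, dim, pop_size=30, gens=100, sigma=0.1):
    """Genetic algorithm over a flat network-weight vector.

    `fitness` returns a score to maximise, e.g. negative prediction error
    on the flame-length training data.
    """
    pop = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(dim)                 # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                   # mutation
                i = random.randrange(dim)
                child[i] += random.gauss(0, sigma)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```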

Keywords: tunnel fire, flame length, ANN, genetic algorithm

Procedia PDF Downloads 643
19691 Structural Equation Modeling Exploration for the Multiple College Admission Criteria in Taiwan

Authors: Tzu-Ling Hsieh

Abstract:

When the Taiwan Ministry of Education implemented a new university multiple entrance policy in 2002, most colleges and universities continued to use test scores as their main admission criteria. With the forthcoming 12-year basic education curriculum, the Ministry of Education introduced a new college admission policy, to be implemented in 2021. The new policy highlights the importance of holistic education by placing more emphasis on the senior high school learning process rather than solely on academic test outcomes. However, the development of college admission criteria has not followed a thoughtful process, and universities and colleges have little guidance on how to construct suitable multiple admission criteria. Although many studies in other countries have examined multiple college admission criteria over the years, these studies cannot represent Taiwanese students, and they are further limited by the lack of comparison between two different academic fields. Therefore, this study investigated multiple admission criteria and their relationship with college success. The study analyzed the Taiwan Higher Education Database, with 12,747 samples from 156 universities, and tested a conceptual framework that examines the relevant factors by structural equation modeling (SEM). The conceptual framework, adapted from Pascarella's general causal model, focuses on how different admission criteria predict students' college success. It examines the relationship between admission criteria and college success, and also how motivation (one of the admission criteria) influences college success through the engagement behaviors of student effort and interactions with agents of socialization. After processing missing values and conducting reliability and validity analyses, the study found three indicators that significantly predict students' college success, defined as the average grade of the last semester: the Chinese language score on the college entrance exam, high school class rank, and the quality of student academic engagement. In addition, motivation significantly predicts the quality of student academic engagement and interactions with agents of socialization. However, the multi-group SEM analysis showed no difference in predicting college success between students from the liberal arts and the sciences. Finally, this study provides suggestions for universities and colleges to develop multiple admission criteria, based on this empirical research on Taiwanese higher education students.

Keywords: college admission, admission criteria, structural equation modeling, higher education, education policy

Procedia PDF Downloads 178
19690 Modelling Volatility Spillovers and Cross Hedging among Major Agricultural Commodity Futures

Authors: Roengchai Tansuchat, Woraphon Yamaka, Paravee Maneejuk

Abstract:

In recent years, the global financial crisis, economic instability, and large fluctuations in agricultural commodity prices have led to increased concern about volatility transmission among these commodities. The problem is further exacerbated by commodity volatility caused by other commodities' price fluctuations, which can make decisions on hedging strategy both costly and ineffective. This paper therefore analyzes the volatility spillover effects among major agricultural commodities, including corn, soybeans, wheat, and rice, to help commodity suppliers hedge their portfolios and manage their risk and co-volatility. We provide a switching-regime approach to analyzing volatility spillovers under different economic conditions, namely economic upturns and downturns. In particular, we investigate the relationships and volatility transmission between these commodities in the different conditions. We propose a copula-based multivariate Markov-switching GARCH model with two regimes that depend on the economic conditions, and perform a simulation study to check the accuracy of the proposed model. In this study, the correlation term in the cross-hedge ratio is obtained from six copula families: two elliptical copulas (Gaussian and Student-t) and four Archimedean copulas (Clayton, Gumbel, Frank, and Joe). We use one-step maximum likelihood estimation to estimate our models and compare the performance of these copulas using the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). In the application to agricultural commodities, weekly data from 4 January 2005 to 1 September 2016, covering 612 observations, are used. The empirical results indicate that the volatility spillover effects among cereal futures differ in response to different economic conditions. In addition, the hedge-effectiveness results suggest optimal cross-hedge strategies under different economic conditions, especially economic upturns and downturns.
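
For context, the minimum-variance cross-hedge ratio that the copula correlation feeds into can be written as a one-line function; in the paper, the correlation comes from the fitted copula and the volatilities from the regime-dependent G-GARCH, whereas here they are plain inputs.

```python
def cross_hedge_ratio(sigma_spot: float, sigma_fut: float, rho: float) -> float:
    """Minimum-variance cross-hedge ratio: h* = rho * sigma_spot / sigma_fut.

    rho: correlation between spot and (cross-)futures returns;
    sigma_spot, sigma_fut: their conditional standard deviations.
    """
    return rho * sigma_spot / sigma_fut
```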

Keywords: agricultural commodity futures, cereal, cross-hedge, spillover effect, switching regime approach

Procedia PDF Downloads 202
19689 Studying the Influence of Logistics on Organizational Performance through a Supply Chain Strategy: Case Study in Goldiran Electronics Co.

Authors: Ali Hajiesmaeili, Mehdi Rahimi, Ehsan Jaberi, Amir Abbas Hosseini

Abstract:

The purpose of this study is to investigate the influence of logistics performance on organizational performance, including both marketing and financial aspects; to show the financial impact of selecting the right marketing and logistics priorities in line with the supply chain type; and to give practitioners an advance identification of their priorities, their supply chain participation type, and the best combination of strategies and resources in this regard. We used the original model's questionnaire to gather the experts' data, and SPSS and AMOS Ver. 22 to analyze the gathered data. The CFA method was also used to test whether a relationship exists between the observed variables and their underlying latent constructs. Supply chain strategy implementation leads to improved logistics performance, and marketing performance is affected as well. Logistics service providers should focus on enhancing supply chain performance, since logistics performance is considered a basis for evaluating supply chain management strategy. Consequently, the performance of the organization is enhanced. This is the first study in Iran to analyze the relationship between logistics and organizational performance in home appliances and home entertainment companies.

Keywords: logistics, organizational, performance, supply chain, strategy

Procedia PDF Downloads 649
19688 Improving Junior Doctor Induction Through the Use of Simple In-House Mobile Application

Authors: Dmitriy Chernov, Maria Karavassilis, Suhyoun Youn, Amna Izhar, Devasenan Devendra

Abstract:

Introduction and Background: A well-structured and comprehensive departmental induction improves patient safety and job satisfaction amongst doctors. The aims of our project were as follows: 1. Assess the perceived preparedness of junior doctors starting their rotation in Acute Medicine at Watford General Hospital. 2. Develop a supplemental induction guide and pocket reference in the form of an iOS mobile application. 3. Collect feedback after implementing the mobile application, following a trial period of 8 weeks with a small cohort of junior doctors. Materials and Methods: A questionnaire was distributed to all new junior trainees starting in the department of Acute Medicine to assess their experience of the current induction. A mobile induction application was developed and trialled over a period of 8 weeks, distributed in addition to the existing didactic induction session. After the trial period, the same questionnaire was distributed to assess the improvement in induction experience. Analytics data were collected with users' consent to gauge user engagement and identify areas of improvement for the application. A feedback survey about the app was also distributed. Results: A total of 32 doctors used the application during the 8-week trial period. The application was accessed 7259 times in total, with the average user spending a cumulative 37 minutes 22 seconds on the app. The most used section was Clinical Guidelines, accessed 1490 times. The app feedback survey revealed positive reviews: 100% of participants (n=15/15) responded that the app improved their overall induction experience compared to other placements; 93% (n=14/15) responded that the app improved overall efficiency in completing daily ward jobs compared to previous rotations; and 93% (n=14/15) responded that the app improved patient safety overall. In the pre-app and post-app induction surveys, participants reported: a 48% improvement in awareness of practical aspects of the job; a 26% improvement in awareness of where to locate pathways and clinical guidelines; and a 40% reduction in feeling overwhelmed. Conclusions and recommendations: This study demonstrates the importance of technology in medical education and clinical induction. The total engagement time equates to over 20 cumulative hours of on-the-job training delivered across users within the 8-week period. The most used and referred-to section was clinical guidelines, which shows that there is high demand for an accessible pocket guide to this type of material. This simple mobile application resulted in a significant improvement in feedback about induction in our department of Acute Medicine and will likely improve workplace satisfaction. Limitations of the application include: the post-app surveys had a small number of participants; the app is currently only available to iPhone users; some useful sections are nested deep within the app; it lacks deep search functionality across all sections; it lacks real-time user feedback; and it requires regular review and updates. Future steps for the app include: developing a web app with an admin dashboard to simplify uploading and editing content; comprehensive search functionality; and a user feedback and peer rating system.

Keywords: mobile app, doctor induction, medical education, acute medicine

Procedia PDF Downloads 86
19687 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequent financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, are generally sampled during acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant's SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors. Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtimes. A series of trials is planned for this model in other manufacturing industries.
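
One of the spectral features the paper combines with the GNN can be illustrated by a spectral-entropy estimate over a sliding window; the exact estimators used in the study are not specified, so this is only a plausible sketch.

```python
import numpy as np

def spectral_entropy(window: np.ndarray) -> float:
    """Shannon entropy of a window's normalised power spectrum."""
    psd = np.abs(np.fft.rfft(window - window.mean())) ** 2
    total = psd.sum()
    if total == 0.0:                  # constant window: no spectral content
        return 0.0
    p = psd / total
    p = p[p > 0]                      # avoid log(0)
    return float(-(p * np.log2(p)).sum())
```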

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 150
19686 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a BANKART Lesion through Finite Element Analysis

Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero., Diego F. Villegas

Abstract:

Mechanical computation is a great tool to study the performance of complex models, an example being the structure of the human body. This paper took advantage of different types of software to build a 3D model of the glenohumeral joint and apply a finite element analysis. The main objective was to study the change in the biomechanical properties of the joint when it presents an injury, specifically a Bankart lesion, which consists in the detachment of the anteroinferior labrum from the glenoid. Stress and strain distributions in the soft tissues were the focus of this study. First, a 3D model was made of a joint without any pathology, as a control sample, using segmentation software for the bones with the support of medical imagery, and a cadaveric model to represent the soft tissue. The joint was built to simulate a compression and external rotation test, using CAD to prepare the model in the adequate position. When the healthy model was finished, it was submitted to a finite element analysis, and the results were validated against experimental model data. With the validated model, a mesh sensitivity analysis was performed to obtain the best mesh size. Finally, the geometry of the 3D model was changed to imitate a Bankart lesion: the contact zone of the glenoid with the labrum was slightly separated, simulating a tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. The data gathered in this study can be used to improve the understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and that the initial data were taken from an in vitro assay.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, Bankart lesion, labrum

Procedia PDF Downloads 161
19685 Automated Feature Detection and Matching Algorithms for Breast IR Sequence Images

Authors: Chia-Yen Lee, Hao-Jen Wang, Jhih-Hao Lai

Abstract:

In recent years, infrared (IR) imaging has been considered a potential tool to assess the efficacy of chemotherapy and to detect breast cancer early. Regions of tumor growth, with their high metabolic rate and angiogenesis, lead to high local temperatures. Observing differences between heat maps over the long term is useful for assessing the growth of breast cancer cells and detecting breast cancer earlier, and multi-time infrared image alignment is a necessary step for this. Representative feature point detection and matching are essential for good image registration and quantitative analysis. However, there are no clear boundaries in infrared images, and the subject's posture differs from shot to shot. Adhesive markers cannot stay on a body surface for very long periods, and it is hard to find anatomic fiducial markers on a body surface. In other words, it is difficult to detect and match features in an IR image sequence. In this study, automated feature detection and matching algorithms are developed for two types of automatic feature points: vascular branch points and modified Harris corners. The preliminary results show that the proposed method identified representative feature points on the IR breast images with 98% accuracy and matched them with 93% accuracy.
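
The Harris-corner stage can be sketched with OpenCV as below. The paper uses a modified Harris detector adapted to low-contrast IR images; since the modification is not specified here, plain Harris with pre-smoothing is shown as an assumption.

```python
import cv2
import numpy as np

def harris_points(ir_img: np.ndarray, max_pts: int = 200) -> np.ndarray:
    """Detect corner-like feature points on a single-channel IR frame."""
    # Smooth first: IR frames are noisy and low-contrast
    img = cv2.GaussianBlur(ir_img.astype(np.float32), (5, 5), 0)
    resp = cv2.cornerHarris(img, blockSize=2, ksize=3, k=0.04)
    # Keep the strongest responses as candidate feature points
    ys, xs = np.unravel_index(
        np.argsort(resp, axis=None)[::-1][:max_pts], resp.shape)
    return np.stack([xs, ys], axis=1)   # (x, y) coordinates, strongest first
```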

Keywords: Harris corner, infrared image, feature detection, registration, matching

Procedia PDF Downloads 304
19684 Development of Pothole Management Method Using Automated Equipment with Multi-Beam Sensor

Authors: Sungho Kim, Jaechoul Shin, Yujin Baek, Nakseok Kim, Kyungnam Kim, Shinhaeng Jo

Abstract:

Climate change and the increase in heavy traffic have been accelerating damage such as potholes on asphalt pavement. Potholes cause traffic accidents, vehicle damage, road casualties, and traffic congestion. A quick and efficient maintenance method is needed because potholes are caused by stripping and accelerate pavement distress. In this study, we propose rapid and systematic pothole management through the development of automated pothole-repair equipment that includes a pothole volume measurement system. Three kinds of cold-mix asphalt mixture were investigated as candidate repair materials and evaluated for compliance with quality standards and applicability to the automated equipment. The pothole volume measurement system is composed of a multi-sensor unit, combining a laser sensor and an ultrasonic sensor, installed at the front and side of the automated repair equipment. An algorithm was proposed to calculate the amount of repair material from the measured pothole volume, and a system for releasing the correct amount of material was developed. Field test results showed that the loss of repair material could be reduced from approximately 20% to 6% per pothole. The rapid automated pothole-repair equipment will contribute to improved quality and to efficient and economical maintenance, not only by saving materials and resources but also by calculating the appropriate amount of material. Through field application, it is possible to improve the accuracy of pothole volume measurement, to refine the calculation of material amounts, and to manage pothole data for roads, thereby enabling more efficient pavement maintenance management. Acknowledgment: The authors would like to thank the MOLIT (Ministry of Land, Infrastructure, and Transport). This work was carried out through a project funded by the MOLIT, named 'development of 20mm grade for road surface detecting roadway condition and rapid detection automation system for removal of pothole'.
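
The material-amount step can be illustrated as follows: per-cell depths from the multi-beam scan are summed into a volume and converted to a mass of cold-mix asphalt. The density and compaction factors are illustrative placeholders, not the values used in the field study.

```python
def repair_mass_kg(depth_grid_m, cell_area_m2, density_kg_m3=2350.0,
                   compaction=1.15):
    """Material quantity from a multi-beam depth scan of a pothole.

    depth_grid_m: 2D grid of measured depths (metres) over the pothole;
    cell_area_m2: footprint of one grid cell. Density and compaction
    allowance are illustrative, not the study's calibrated values.
    """
    volume_m3 = sum(max(d, 0.0) for row in depth_grid_m for d in row) * cell_area_m2
    return volume_m3 * density_kg_m3 * compaction
```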

Keywords: automated equipment, management, multi-beam sensor, pothole

Procedia PDF Downloads 224
19683 A Quantitative Study on the “Unbalanced Phenomenon” of Mixed-Use Development in the Central Area of Nanjing Inner City Based on the Meta-Dimensional Model

Authors: Yang Chen, Lili Fu

Abstract:

Promoting urban regeneration in existing areas has been elevated to a national strategy in China. In this context, because of its multidimensional sustainability effects through the intensive use of land, mixed-use development has become an important objective for high-quality urban regeneration in the inner city. However, over the long period since China's reform and opening up, the "unbalanced phenomenon" of mixed-use development in China's inner cities has been very serious. On the one hand, excessive focus on certain individual spaces has raised the level of mixed-use development in some areas substantially ahead of others, widening the gap between different parts of the inner city. On the other hand, excessive focus on a single dimension of the spatial organization of mixed-use development, such as enhancing the functional mix or spatial capacity, has led to lagging or neglected construction of the other dimensions, such as pedestrian permeability, green environmental quality, and social inclusion. This phenomenon is particularly evident in the central area of the inner city, and it clearly runs counter to the needs of sustainable development in China's new era. A rational qualitative and quantitative analysis of the "unbalanced phenomenon" will therefore help identify the problem and provide a basis for formulating optimization plans in the future. This paper builds a dynamic evaluation method for mixed-use development based on a meta-dimensional model and then uses spatial evolution analysis and spatial consistency analysis in ArcGIS to reveal the "unbalanced phenomenon" over the past 40 years in the central area of Nanjing, a typical Chinese city facing regeneration. The study finds that, compared with the increases in functional mix and capacity, the dimensions of residential space mix, public service facility mix, pedestrian permeability, and greenness in Nanjing's central area have shown varying degrees of lagging improvement, and that the unbalanced development problems differ across parts of the city center, so governance and planning for future mixed-use development need to address these problems fully. The research methodology of this paper provides a tool for the comprehensive dynamic identification of changes in the level of mixed-use development, and the results deepen our knowledge of the evolution of mixed-use development patterns in China's inner cities and provide a reference for future regeneration practices.

Keywords: mixed-use development, unbalanced phenomenon, the meta-dimensional model, over the past 40 years of Nanjing, China

Procedia PDF Downloads 104
19682 A Model for Reverse-Mentoring in Education

Authors: Sabine A. Zauchner-Studnicka

Abstract:

As the term indicates, reverse-mentoring flips the classical roles of mentoring: In school, students take over the role of mentors for adults, i.e. teachers or parents. Originally reverse-mentoring stems from US enterprises, which implemented this innovative method in order to benefit from the resources of skilled younger employees for the enhancement of IT competences of senior colleagues. However, reverse-mentoring in schools worldwide is rare. Based on empirical studies and theoretical approaches, in this article an implementation model for reverse-mentoring is developed in order to bring the significant potential reverse-mentoring has for education into practice.

Keywords: reverse-mentoring, innovation in education, implementation model, school education

Procedia PDF Downloads 248
19681 Surviral: An Agent-Based Simulation Framework for SARS-CoV-2 Outcome Prediction

Authors: Sabrina Neururer, Marco Schweitzer, Werner Hackl, Bernhard Tilg, Patrick Raudaschl, Andreas Huber, Bernhard Pfeifer

Abstract:

History and the current outbreak of COVID-19 have shown the deadly potential of infectious diseases. However, infectious diseases also have a serious impact on areas other than health and healthcare, such as the economy or social life, and these areas are strongly codependent. Therefore, disease control measures, such as social distancing, quarantines, curfews, or lockdowns, have to be adopted in a very considered manner. Infectious disease modeling can support policy and decision-makers with adequate information regarding the dynamics of the pandemic and therefore assist in planning and enforcing appropriate measures to prevent the healthcare system from collapsing. In this work, an agent-based simulation package named "survival" for simulating infectious diseases is presented, with a special focus on SARS-CoV-2. The presented simulation package was used in Austria to model the SARS-CoV-2 outbreak from the beginning of 2020. Agent-based modeling is a relatively recent modeling approach. Since our world is getting more and more complex, the complexity of the underlying systems is also increasing; the development of tools and frameworks and increasing computational power are advancing the application of agent-based models. For parametrizing the presented model, different data sources, such as known infections, wastewater virus load, blood donor antibodies, circulating virus variants, and the used hospitalization capacity, as well as the availability of medical materials like ventilators, were integrated through a database system. The simulation results of the model were used to predict the dynamics and the possible outcomes and were used by the health authorities to decide on the measures to be taken to control the pandemic situation. The survival package was implemented in the programming language Java, and the analytics were performed with RStudio. During the first run in March 2020, the simulation showed that without measures other than individual personal behavior and appropriate medication, the death toll would have been about 27 million people worldwide within the first year. The model predicted the hospitalization rates (standard and intensive care) for Tyrol and South Tyrol with an average error of about 1.5%, calculated as 10-day forecasts. The state government and the hospitals were provided with these 10-day forecasts to support their decision-making, which ensured that standard care was maintained for as long as possible without restrictions. Furthermore, various measures were estimated and thereafter enforced: among other things, communities were quarantined based on the calculations, while curfews for the entire population were reduced in accordance with them. With this framework, which is used by the national crisis team of the Austrian province of Tyrol, a very accurate model could be created at the federal state level as well as at the district and municipal levels, providing decision-makers with a solid information basis. The framework can be transferred to various infectious diseases and thus can be used as a basis for future monitoring.
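
A toy agent-based S-I-R loop conveys the flavour of the approach; the real survival framework is written in Java and adds hospital capacity, variants, measures, and data-driven calibration. All parameters below are illustrative.

```python
import random

def simulate(n=1000, seeds=5, beta=0.3, gamma=0.1, contacts=10, days=120):
    """Minimal agent-based S-I-R epidemic: returns daily infected counts."""
    S, I, R = 0, 1, 2
    state = [I] * seeds + [S] * (n - seeds)
    history = []
    for _ in range(days):
        infected = [i for i, s in enumerate(state) if s == I]
        for i in infected:
            for _ in range(contacts):                 # random daily contacts
                j = random.randrange(n)
                if state[j] == S and random.random() < beta:
                    state[j] = I                      # transmission
            if random.random() < gamma:               # recovery
                state[i] = R
        history.append(sum(s == I for s in state))
    return history
```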

Keywords: modelling, simulation, agent-based, SARS-CoV-2, COVID-19

Procedia PDF Downloads 174
19680 Development of a 3D Neck Muscle Model to Analyze the Effect of Active Muscle Contraction in Whiplash Injury

Authors: Nisha Nandlal Sharma, Julaluk Carmai, Saiprasit Koetniyom, Bernd Markert

Abstract:

Whiplash injuries are mostly sustained in car accidents. Neck pain and headaches are the two most common symptoms reported in studies, yet the whiplash injury mechanism is poorly understood. In the present study, a hybrid neck muscle model was developed with a combination of solid tetrahedral elements and 1D beam elements. The solid tetrahedral elements represent the passive part of the muscle, whereas the 1D beam elements represent the active part. To simulate the active behavior of the muscle, a Hill-type muscle model was applied to the beam elements. To capture the nonlinear passive properties of the muscle, the solid elements were modeled with a rubber/foam material model. Selected important neck muscles were then inserted into THUMS (Total Human Model for Safety), and the model was given boundary conditions similar to those of the experimental tests. The model was exposed to 4 g and 7 g rear impacts, as these loads are close to the low-speed impacts that cause whiplash. The effect of the muscle activation level on occupant kinematics during whiplash was then analyzed.
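
For readers unfamiliar with Hill-type models, the active force in the beam elements can be sketched as activation x maximum isometric force x force-length x force-velocity, plus a passive term. The Python sketch below shows a generic Hill-type force law; the curve shapes and parameters are illustrative placeholders, not the values used in THUMS.

```python
import math

def hill_force(a, l_norm, v_norm, f_max=1000.0):
    """Generic Hill-type muscle force (illustrative shapes and parameters).

    a      : activation level in [0, 1]
    l_norm : fiber length / optimal fiber length
    v_norm : fiber velocity / max shortening velocity (negative = shortening)
    """
    # Active force-length relation: bell curve centered at optimal length.
    f_l = math.exp(-((l_norm - 1.0) ** 2) / 0.45)
    # Force-velocity relation: hyperbolic drop when shortening,
    # mild plateau above isometric force when lengthening.
    if v_norm < 0.0:
        f_v = max((1.0 + v_norm) / (1.0 - v_norm / 0.25), 0.0)
    else:
        f_v = 1.0 + 0.5 * v_norm / (0.25 + v_norm)
    # Passive force-length relation: engages beyond optimal length.
    f_p = 0.02 * (math.exp(5.0 * (l_norm - 1.0)) - 1.0) if l_norm > 1.0 else 0.0
    return f_max * (a * f_l * f_v + f_p)

# Half-activated muscle at optimal length, isometric: ~500 N.
print(hill_force(a=0.5, l_norm=1.0, v_norm=0.0))
```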

Keywords: finite element model, muscle activation, THUMS, whiplash injury mechanism

Procedia PDF Downloads 334
19679 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network

Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem

Abstract:

The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series and adopt a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Second, the residuals from the k-factor GARMA model are used as a proxy for the conditional variance, and these residuals are predicted using two different approaches. In the first approach, a local linear wavelet neural network (LLWNN) model is developed to predict the conditional variance using the back-propagation learning algorithm. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity (G-GARCH) process is adopted, and the parameters of the k-factor GARMA-G-GARCH model are estimated using a wavelet methodology based on the discrete wavelet packet transform (DWPT). The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.
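
For context, the conditional-mean part rests on the Gegenbauer long-memory filter (1 - 2uB + B^2)^(-d), whose MA(infinity) weights follow the standard three-term Gegenbauer recursion. Below is a minimal Python sketch of the one-factor case (u is the Gegenbauer frequency parameter, d the memory parameter); the parameter values and truncation length are illustrative, not those estimated in the paper.

```python
import numpy as np

def gegenbauer_weights(d, u, n_terms=200):
    """MA(inf) weights of (1 - 2uB + B^2)^(-d): the Gegenbauer
    polynomials C_j^(d)(u), via the standard three-term recursion
      j*C_j = 2u(j + d - 1)*C_{j-1} - (j + 2d - 2)*C_{j-2}.
    """
    c = np.zeros(n_terms)
    c[0] = 1.0
    if n_terms > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n_terms):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def simulate_garma(n=500, d=0.3, u=0.8, seed=0):
    """Simulate a one-factor GARMA(0, d, 0) series by filtering
    Gaussian white noise with a truncated Gegenbauer filter."""
    rng = np.random.default_rng(seed)
    n_terms = 200
    eps = rng.standard_normal(n + n_terms)
    w = gegenbauer_weights(d, u, n_terms)
    # x_t = sum_j w_j * eps_{t-j}, keeping only fully-overlapped terms.
    return np.convolve(eps, w, mode="full")[n_terms - 1:n_terms - 1 + n]

print(simulate_garma()[:5])
```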

Keywords: electricity price, k-factor GARMA, LLWNN, G-GARCH, forecasting

Procedia PDF Downloads 231
19678 Patient Outcomes Following Out-of-Hospital Cardiac Arrest

Authors: Scott Ashby, Emily Granger, Mark Connellan

Abstract:

Background: In-hospital management of out-of-hospital cardiac arrest (OHCA) is complex, as the aetiologies are varied. Acute coronary angiography has been shown to improve outcomes for patients whose arrest was caused by coronary occlusion; however, these patients are difficult to identify. ECG results may help identify them, but the accuracy of this diagnostic test is under debate and requires further investigation. Methods: Arrest and hospital management information was collated retrospectively for OHCA patients who presented to a single clinical site between 2009 and 2013. Angiography results were then collected and tested for association with survival to discharge. The presence of a severe lesion (>70% stenosis) was compared to categorised ECG findings, and the accuracy of the test was calculated. Results: 104 patients were included in this study; 44 survived to discharge, 52 died, and 8 were transferred to other clinical sites. Angiography was significantly associated with survival to discharge. ECG showed 54.8% sensitivity for detecting the presence of a severe lesion within the group that received angiography. A combined criterion including any ECG pathology showed 100% sensitivity and negative predictive value, but low specificity and positive predictive value. Conclusion: In the cohort investigated, ST elevation on ECG is not a sensitive enough screening test for determining whether OHCA patients have coronary stenosis as the likely cause of their arrest. Further investigation is needed into screening with a combined ECG criterion, or into whether all patients should routinely receive angiography following OHCA.
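
The accuracy measures quoted above follow directly from a 2x2 contingency table. The Python sketch below computes them; the counts shown are illustrative placeholders, not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy measures."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only (not the study's data); test = ECG finding,
# disease = severe (>70%) coronary lesion on angiography.
print(diagnostic_metrics(tp=40, fp=10, fn=20, tn=30))
```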

Keywords: out of hospital cardiac arrest, coronary angiography, resuscitation, emergency medicine

Procedia PDF Downloads 395
19677 A Performance Model for Designing Networks in Reverse Logistics

Authors: S. Dhib, S. A. Addouche, T. Loukil, A. Elmhamedi

Abstract:

In this paper, a reverse supply chain network is investigated for decision making. This decision is surrounded by complex flows of returned products, due to their increasing quantity, their varying types, and the variety of recovery options (reuse, recycling, and refurbishment). The most important problem in a reverse logistics network (RLN) is to route returned products to the suitable recovery option. However, the routing of returned products from collection sources to recovery dispositions has not been well considered in existing performance models. In this study, we propose a performance model for designing a network configuration in reverse logistics. Conceptual and analytical models are developed, taking into account operational, economic, and environmental factors in the network design.
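
The core routing problem described here, assigning returned products from collection sources to recovery options, can be written as a small transportation-style linear program. The following Python sketch (using scipy.optimize.linprog) is an illustration with made-up costs and capacities; the paper's actual performance model also weighs environmental factors, which are omitted here.

```python
import numpy as np
from scipy.optimize import linprog

options = ["reuse", "recycling", "refurbishment"]
returns = np.array([120.0, 80.0])            # units from 2 collection sources
cost = np.array([[2.0, 5.0, 3.0],            # unit cost: source x option
                 [4.0, 1.0, 6.0]])
capacity = np.array([100.0, 90.0, 70.0])     # per-option capacity

n_src, n_opt = cost.shape
c = cost.flatten()                            # decision x[s, o], row-major

# Each source's returns must be fully assigned (equality constraints).
A_eq = np.zeros((n_src, n_src * n_opt))
for s in range(n_src):
    A_eq[s, s * n_opt:(s + 1) * n_opt] = 1.0
# Option capacities must not be exceeded (inequality constraints).
A_ub = np.zeros((n_opt, n_src * n_opt))
for o in range(n_opt):
    A_ub[o, o::n_opt] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=returns,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_src, n_opt), res.fun)
```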

Keywords: reverse logistics, network design, performance model, open loop configuration

Procedia PDF Downloads 435
19676 The Effect of Arbitrary Support Conditions on the Static Behavior of Curved Beams Using the Finite Element Method

Authors: Hossein Mottaghi T., Amir R. Masoodi

Abstract:

This study presents a curved finite element for analyzing the static behavior of curved beams within the elastic range. The objective is to enhance accuracy while reducing the number of elements by incorporating the first-order shear deformation theory of Timoshenko beams. First, finite element formulations are developed by assuming polynomial trial functions for the axial, shear, and rotational deformations of a three-node element. Subsequently, the nodal interpolation functions for this element are derived, followed by the construction of the element stiffness matrix. To use this stiffness matrix in the static analysis of curved beams, the matrix constructed in the local coordinates of the element is transformed to the global coordinate system using the rotation matrix. A numerical benchmark example is investigated to assess the accuracy and effectiveness of the method. Moreover, the influence of spring stiffness on the rotation of the endpoint of a clamped beam is examined by replacing each support reaction of the beam with a spring. In a parametric study, the effect of the beam's central angle on the rotation of its endpoint under a concentrated load is examined for a cantilever configuration. This research encompasses various mechanical, geometrical, and boundary configurations to evaluate the static characteristics of curved beams, thus providing valuable insights for their analysis.
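
The local-to-global step described above is the standard congruence transformation K_global = T^T K_local T, where T is the block-diagonal rotation matrix of the element. A minimal numpy sketch follows; the DOF layout and the placeholder stiffness matrix are illustrative assumptions, not the paper's actual three-node formulation.

```python
import numpy as np

def rotation_matrix(theta, n_nodes=3, dofs_per_node=3):
    """Block-diagonal transformation T for a planar element whose local
    x-axis makes angle theta with the global X-axis; the translational
    DOFs (u, v) rotate, the in-plane rotation DOF is unchanged."""
    cs, sn = np.cos(theta), np.sin(theta)
    block = np.array([[cs, sn, 0.0],
                      [-sn, cs, 0.0],
                      [0.0, 0.0, 1.0]])
    d = dofs_per_node
    T = np.zeros((n_nodes * d, n_nodes * d))
    for i in range(n_nodes):
        T[i * d:(i + 1) * d, i * d:(i + 1) * d] = block
    return T

def to_global(K_local, theta):
    """Standard congruence transformation K_global = T^T K_local T."""
    T = rotation_matrix(theta)
    return T.T @ K_local @ T

# Placeholder check: an identity "stiffness" matrix is rotation-invariant.
K_local = np.eye(9)
print(np.allclose(to_global(K_local, np.pi / 6), np.eye(9)))  # True
```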

Keywords: curved beam, finite element method, first-order shear deformation theory, elastic support

Procedia PDF Downloads 72
19675 Developing a Mathematical Model for Trade-Off Analysis of New Green Products

Authors: M. R. Gholizadeh, N. Bhuiyan, M. Salari

Abstract:

In the near future, companies will be increasingly forced to shift their activities onto a new road in order to decrease the harmful effects of their design, production, and end-of-life phases on the environment. Products must meet environmental standards not only to avoid penalties but also to ensure sustainability for future generations. The most important challenge companies face, however, is selecting a reasonable strategy to maximize their profit. Companies therefore need a precise forecast of their profit after the design stage, obtained through trade-off analysis. This paper introduces a mathematical model of the factors that affect total profit when products are designed for resource and energy efficiency or for recyclability. The model is adapted to different strategies based on a cost-volume-profit (CVP) model. Here, the cost structure consists of recycling cost, development cost, ramp-up cost, production cost, and pollution cost. The model also shows the effect of implementing design for recyclability on the revenue structure through revenue from used parts and revenue from recycled materials. A numerical example is used to evaluate the proposed model. Results show that green product development not only reduces the environmental impact of products but also increases company profit in the long term.
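
A minimal sketch of the modified cost-volume-profit calculation is given below: the unit margin includes the extra revenue from used parts and recycled materials and the extra per-unit recycling and pollution costs, while development and ramp-up costs enter as fixed costs. All figures are made-up placeholders, not values from the paper.

```python
def green_cvp_profit(volume,
                     unit_price=50.0,
                     unit_production_cost=30.0,
                     unit_recycling_cost=2.0,
                     unit_pollution_cost=1.5,
                     unit_used_parts_revenue=3.0,
                     unit_recycled_material_revenue=2.5,
                     development_cost=200_000.0,
                     ramp_up_cost=50_000.0):
    """Modified cost-volume-profit (CVP) model for a green product;
    cost categories and extra revenue streams follow the abstract,
    all numbers are illustrative."""
    unit_margin = (unit_price
                   + unit_used_parts_revenue
                   + unit_recycled_material_revenue
                   - unit_production_cost
                   - unit_recycling_cost
                   - unit_pollution_cost)
    fixed_costs = development_cost + ramp_up_cost
    profit = volume * unit_margin - fixed_costs
    break_even = fixed_costs / unit_margin
    return profit, break_even

profit, be = green_cvp_profit(volume=20_000)
print(f"profit = {profit:,.0f}, break-even volume = {be:,.0f} units")
```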

Keywords: green product, design for environment, C-V-P model, trade-off analysis

Procedia PDF Downloads 316