Search results for: management models
13561 General Mathematical Framework for Analysis of Cattle Farm System
Authors: Krzysztof Pomorski
Abstract:
In this work we present a universal mathematical framework for modeling a cattle farm system, in which various hypotheses can be set and validated against experimental data. The presented work is preliminary, but it is expected to become a valid tool for deeper future analysis that can result in a new class of prediction methods allowing early detection of cow diseases as well as assessment of cow performance. The presented work is therefore relevant both to agricultural models and to machine learning. It also opens the possibility of incorporating a certain class of biological models necessary for modeling cow behavior and farm performance, which might include the impact of the environment on the farm system. Particular attention is paid to the model of coupled oscillators, which is the basic building hypothesis from which a model showing certain periodic or quasiperiodic behavior can be constructed.
Keywords: coupled ordinary differential equations, cattle farm system, numerical methods, stochastic differential equations
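As an illustration of the coupled-oscillator hypothesis mentioned in the abstract, a minimal Python sketch of two coupled phase oscillators is shown below. The equations, frequencies, and coupling constants are illustrative assumptions, not the authors' actual model.

import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical two-oscillator model: a cow-level rhythm coupled to a farm-level rhythm;
# omega_* are natural frequencies (1/h), k_* are coupling strengths (all illustrative).
omega_cow, omega_farm = 2 * np.pi / 24.0, 2 * np.pi / 168.0   # daily and weekly cycles
k_cow, k_farm = 0.05, 0.01

def coupled_oscillators(t, y):
    theta_cow, theta_farm = y
    dtheta_cow = omega_cow + k_cow * np.sin(theta_farm - theta_cow)
    dtheta_farm = omega_farm + k_farm * np.sin(theta_cow - theta_farm)
    return [dtheta_cow, dtheta_farm]

sol = solve_ivp(coupled_oscillators, t_span=(0, 24 * 14), y0=[0.0, 0.0])
print(sol.y[:, -1])  # phases after two weeks of simulated time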
Procedia PDF Downloads 145
13560 Engineering Management and Practice in Nigeria
Authors: Harold Jideofor
Abstract:
The application of Project Management (PM) tools and techniques in the public sector is gradually becoming an important issue in developing economies, especially in a country like Nigeria where projects of different sizes and structures are undertaken. The paper examined the application of project management practice in the public sector in Nigeria. The PM lifecycles, tools, and techniques were presented. The study was carried out in Lagos because of its metropolitan nature and rapidly growing economy. Twenty-three copies of a questionnaire were administered to 23 public institutions in Lagos to generate primary data. Descriptive analysis techniques using percentages and table presentations, coupled with the coefficient of correlation, were used for data analysis. The study revealed that the application of PM tools and techniques is an essential management approach that tends to achieve specified objectives within specific time and budget limits through the optimum use of resources. Furthermore, the study noted that there is a lack of in-depth knowledge of PM tools and techniques in the public sector institutions sampled, and a high cost of application was also observed by the respondents. The study recommended, among others, that PM tools and techniques should be applied gradually, especially in old government institutions where resistance to change is perceived to be high.
Keywords: project management, public sector, practice, Nigeria
Procedia PDF Downloads 342
13559 Mitigation of Profitable Problems: Level of Hotel Quality Management Program and Environmental Management Practices Towards Performance
Authors: Siti Anis Nadia Abu Bakar, Vani Tanggamani
Abstract:
Over recent years, quality and environmental management practices have become necessary tasks in the hospitality industry in order to provide high-quality services, a comfortable and safe environment for occupants, innovativeness and shareholder satisfaction, and sustainable environmental and social added value. Numerous studies have observed and measured quality management programs (QMProg) and environmental management practices (EMPrac) independently. This paper analyzed the level of QMProg and EMPrac in the hospitality industry, particularly their effect on hotel performance, specifically in the context of Malaysia, as the hotel industry has contributed tremendously to the development of the Malaysian tourism industry. The research objectives are (1) to analyze how the level of QMProg influences firm performance, and (2) to investigate the level of EMPrac and its influence on firm performance. This paper contributes to the literature by adding value to service industry strategic decision-making processes, helping to predict the varying impacts of positive and negative corporate social responsibility (CSR) activities on financial performance in the respective industries. Further, this paper also contributes to the development of more applicable CSR strategies. In fact, the findings contribute towards an integrated management system that will assist a firm in implementing its environmental strategy by creating a higher level of accountability for environmental performance. The best results in environmental systems have encouraged managers to explore more options when dealing with problems, especially problems involving the reputation of their hotel. In conclusion, the results of the study infer that the best CSR strategies of quality and environmental management practices influence hotel performance.
Keywords: corporate social responsibility (CSR), environmental management practices (EMPrac), performance (PERF), quality management program (QMProg)
Procedia PDF Downloads 374
13558 Project Objective Structure Model: An Integrated, Systematic and Balanced Approach in Order to Achieve Project Objectives
Authors: Mohammad Reza Oftadeh
Abstract:
The purpose of this article is to describe the project objective structure (POS) concept, which was developed based on research activities and experience in project management, the Balanced Scorecard (BSC), and the European Foundation for Quality Management Excellence Model (EFQM Excellence Model). Furthermore, this paper tries to define a balanced, systematic, and integrated measurement approach to meeting project objectives and project strategic goals based on a process-oriented model. In this paper, POS is suggested as a way to measure project performance across the project life cycle. Using the POS model, the project manager can better ensure that the project objectives stated in the project charter are achieved. This concept can help project managers to implement integrated and balanced monitoring and control of project work.
Keywords: project objectives, project performance management, PMBOK, key performance indicators, integration management
Procedia PDF Downloads 378
13557 PredictionSCMS: The Implementation of an AI-Powered Supply Chain Management System
Authors: Ioannis Andrianakis, Vasileios Gkatas, Nikos Eleftheriadis, Alexios Ellinidis, Ermioni Avramidou
Abstract:
The paper discusses the main aspects involved in the development of a supply chain management system, using the newly developed PredictionSCMS software as the basis for the discussion. The discussion is focused on three topics: the first is demand forecasting, where we present the predictive algorithms implemented and discuss related concepts such as the calculation of the safety stock, the effect of out-of-stock days, etc. The second topic concerns the design of a supply chain, where the core parameters involved in the process are given, together with a methodology for incorporating these parameters in a meaningful order creation strategy. Finally, the paper discusses some critical events that can happen during the operation of a supply chain management system and how the developed software notifies the end user about their occurrence.
Keywords: demand forecasting, machine learning, risk management, supply chain design
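The abstract mentions the calculation of safety stock without giving the formula used by PredictionSCMS. A minimal sketch of the classic textbook calculation (SS = z * sigma_daily * sqrt(lead time)), which is an assumption and not necessarily the software's method, could look like this:

import math
from scipy.stats import norm

def safety_stock(daily_demand_std, lead_time_days, service_level=0.95):
    # Classic formula: SS = z * sigma_daily * sqrt(lead time), z from the target service level.
    z = norm.ppf(service_level)
    return z * daily_demand_std * math.sqrt(lead_time_days)

def reorder_point(avg_daily_demand, lead_time_days, ss):
    return avg_daily_demand * lead_time_days + ss

ss = safety_stock(daily_demand_std=12.0, lead_time_days=7, service_level=0.95)
print(round(ss, 1), round(reorder_point(100.0, 7, ss), 1))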
Procedia PDF Downloads 96
13556 Fault Analysis of Induction Machine Using Finite Element Method (FEM)
Authors: Wiem Zaabi, Yemna Bensalem, Hafedh Trabelsi
Abstract:
The paper presents a finite element (FE) based efficient analysis procedure for an induction machine (IM). Two FE formulation approaches are proposed to achieve this goal: the magnetostatic and the non-linear transient time-stepped formulations. Studies based on finite element models offer much more information on the phenomena characterizing the operation of electrical machines than classical analytical models, which explains the increasing interest in finite element investigations of electrical machines. Based on finite element models, this paper studies the influence of stator and rotor faults on the behavior of the IM. In this work, a simple dynamic model for an IM with an inter-turn winding fault and a broken bar fault is presented. This fault model is used to study the IM under various fault conditions and severities. Simulations are conducted to validate the fault model for different levels of fault severity, and the comparison of the results obtained allowed verifying the precision of the proposed FEM model. This paper also presents a technique based on Fast Fourier Transform (FFT) analysis of the stator current and the electromagnetic torque to detect broken rotor bar faults. The technique used and the obtained results clearly show the possibility of extracting signatures to detect and locate faults.
Keywords: finite element method (FEM), induction motor (IM), short-circuit fault, broken rotor bar, fast Fourier transform (FFT) analysis
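As an illustration of the FFT-based signature extraction mentioned in the abstract, the sketch below inspects the (1 ± 2s)f sidebands of a simulated stator current, which are the classical broken-bar signatures. The simulated signal, slip, and amplitudes are illustrative assumptions, not the paper's FE results.

import numpy as np

fs, f_supply, slip = 10_000.0, 50.0, 0.03          # sampling rate, supply frequency, slip (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)
# Simulated stator current: fundamental plus small (1 +/- 2s)f sidebands typical of a broken bar.
i_s = (np.sin(2 * np.pi * f_supply * t)
       + 0.02 * np.sin(2 * np.pi * (1 - 2 * slip) * f_supply * t)
       + 0.02 * np.sin(2 * np.pi * (1 + 2 * slip) * f_supply * t))

spectrum = np.abs(np.fft.rfft(i_s * np.hanning(len(i_s)))) / len(i_s)
freqs = np.fft.rfftfreq(len(i_s), 1.0 / fs)
for f_target in [(1 - 2 * slip) * f_supply, (1 + 2 * slip) * f_supply]:
    k = np.argmin(np.abs(freqs - f_target))
    print(f"amplitude near {freqs[k]:.1f} Hz: {spectrum[k]:.4f}")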
Procedia PDF Downloads 301
13555 Influence of Building Orientation and Post Processing Materials on Mechanical Properties of 3D-Printed Parts
Authors: Raf E. Ul Shougat, Ezazul Haque Sabuz, G. M. Najmul Quader, Monon Mahboob
Abstract:
Since there are many ways of building and post-processing parts or models in 3D printing technology, the main objective of this research is to provide an understanding of how the mechanical characteristics of 3D-printed parts change for different building orientations and infiltrates. Tensile, compressive, flexural, and hardness tests were performed to analyze the mechanical properties of those models. Specimens were designed in CAD software, printed on a Z-printer 450 with five different build orientations, and post-processed with four different infiltrates. Results show that with a change of infiltrate or orientation, each of the above mechanical properties changes, and for each infiltrate the highest tensile strength, flexural strength, and hardness are found for the orientation with the lowest number of layers during printing.
Keywords: 3D printing, building orientations, infiltrates, mechanical characteristics, number of layers
Procedia PDF Downloads 280
13554 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models
Authors: C. F. Kumru, C. Kocatepe, O. Arikan
Abstract:
In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and time domains to observe instantaneous values along with the effective ones. Considering the results of the study, different line geometries considerably affect the magnitude and distribution of the electric field even though the line voltages are the same. Furthermore, it is observed that the maximum instantaneous electric field values obtained in the time-domain analysis are considerably higher than the effective values obtained in stationary mode. Consequently, electric field distribution analyses should be made individually for each different line model, and the exposure limit values or distances to residential buildings should be defined according to the results obtained.
Keywords: electric field, energy transmission line, finite element method, pylon
Procedia PDF Downloads 728
13553 Diversification and Risk Management in Non-Profit Organisations: A Case Study
Authors: Manzurul Alam, John Griffiths, David Holloway, Megan Paull, Anne Clear
Abstract:
Background: This paper investigates the nature of risk management practices in non-profit organizations. It is argued here that the risk exposure of these organizations has increased as a result of their entrepreneurial activities. This study explores how a particular non-profit organization formulates its risk strategies in the face of funding restrictions. Design/Method/Approach: The study adopts a case study approach to report how a non-profit organization diversifies its activities, tackles the risks arising from such activities, and improves performance. Results: The findings show that the organization made structural adjustments and leadership changes which helped to adjust its risk strategies. The paper also reports the organizational processes for dealing with risks arising from both related and unrelated diversification strategies. Implications: Any generalization from this case example needs to be made with caution, as there are significant differences between non-profit organizations operating in different sectors. Originality: The paper makes a significant contribution to the non-profit literature by highlighting diversification strategies along with risk performance.
Keywords: risk management, performance management, non-profit organizations, financial management
Procedia PDF Downloads 515
13552 A Grey-Box Text Attack Framework Using Explainable AI
Authors: Esther Chiramal, Kelvin Soh Boon Kai
Abstract:
Explainable AI is a strong strategy implemented to understand complex black-box model predictions in a human-interpretable language. It provides the evidence required to deploy trustworthy and reliable AI systems. On the other hand, however, it also opens the door to locating possible vulnerabilities in an AI model. Traditional adversarial text attacks use word substitution, data augmentation techniques, and gradient-based attacks on powerful pre-trained Bidirectional Encoder Representations from Transformers (BERT) variants to generate adversarial sentences. These attacks are generally white-box in nature and not practical, as they can be easily detected by humans, e.g., changing the word from “Poor” to “Rich”. We propose a simple yet effective grey-box cum black-box approach that does not require knowledge of the target model and uses a set of surrogate Transformer/BERT models to perform the attack using explainable AI techniques. As Transformers are the current state-of-the-art models for almost all Natural Language Processing (NLP) tasks, an attack generated from BERT1 is transferable to BERT2. This transferability is made possible by the attention mechanism in the transformer, which allows the model to capture long-range dependencies in a sequence. Using the power of BERT generalisation via attention, we attempt to exploit how transformers learn by attacking a few surrogate transformer variants that are each based on a different architecture. We demonstrate that this approach is highly effective in generating semantically good sentences by changing as little as one word, in a way that is not detectable by humans while still fooling other BERT models.
Keywords: BERT, explainable AI, grey-box text attack, transformer
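To illustrate the general idea of explanation-guided word substitution (not the authors' actual framework), the sketch below scores word importance on a toy surrogate classifier by deletion and then substitutes the most influential word. The surrogate pipeline, the deletion heuristic, and the synonym table are all illustrative assumptions standing in for a BERT surrogate and a proper explainer.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy surrogate classifier standing in for a BERT variant (illustrative only).
train_texts = ["poor service and rude staff", "rich flavours and great service",
               "terrible, poor quality", "excellent, great quality"]
train_labels = [0, 1, 0, 1]
surrogate = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(train_texts, train_labels)

def importance_by_deletion(text):
    # Score each word by how much removing it changes the surrogate's positive-class probability.
    base = surrogate.predict_proba([text])[0, 1]
    words = text.split()
    scores = []
    for i, w in enumerate(words):
        reduced = " ".join(words[:i] + words[i + 1:])
        scores.append((abs(base - surrogate.predict_proba([reduced])[0, 1]), i, w))
    return max(scores)  # most influential word

synonyms = {"poor": "modest", "great": "decent"}   # hypothetical substitution table
sentence = "poor service but great location"
_, idx, word = importance_by_deletion(sentence)
attacked = sentence.split()
attacked[idx] = synonyms.get(word, word)
print(" ".join(attacked), surrogate.predict([sentence])[0], surrogate.predict([" ".join(attacked)])[0])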
Procedia PDF Downloads 137
13551 Evaluation Metrics for Machine Learning Techniques: A Comprehensive Review and Comparative Analysis of Performance Measurement Approaches
Authors: Seyed-Ali Sadegh-Zadeh, Kaveh Kavianpour, Hamed Atashbar, Elham Heidari, Saeed Shiry Ghidary, Amir M. Hajiyavand
Abstract:
Evaluation metrics play a critical role in assessing the performance of machine learning models. In this review paper, we provide a comprehensive overview of performance measurement approaches for machine learning models. For each category of learning task (supervised, unsupervised, and reinforcement learning), we discuss the most widely used metrics, including their mathematical formulations and interpretation. Additionally, we provide a comparative analysis of performance measurement approaches for metric combinations. Our review paper aims to provide researchers and practitioners with a better understanding of performance measurement approaches and to aid in the selection of appropriate evaluation metrics for their specific applications.
Keywords: evaluation metrics, performance measurement, supervised learning, unsupervised learning, reinforcement learning, model robustness and stability, comparative analysis
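As a concrete example of the supervised-learning metrics the review covers, the short sketch below computes several common classification metrics with scikit-learn; the labels, scores, and 0.5 threshold are illustrative only.

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score, roc_auc_score

y_true  = [0, 1, 1, 0, 1, 0, 1, 1]                 # toy ground-truth labels (illustrative)
y_score = [0.2, 0.8, 0.6, 0.4, 0.9, 0.3, 0.45, 0.7]
y_pred  = [int(s >= 0.5) for s in y_score]          # thresholded predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("F1       :", f1_score(y_true, y_pred))
print("ROC AUC  :", roc_auc_score(y_true, y_score))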
Procedia PDF Downloads 73
13550 The Quality of Management: A Leadership Maturity Model to Leverage Complexity
Authors: Marlene Kuhn, Franziska Schäfer, Heiner Otten
Abstract:
Today's production processes experience a constant increase in complexity, paving new ways for progressive forms of leadership. In customized production, individual customer requirements drive companies to adapt their manufacturing processes constantly, while the pressure for smaller lot sizes, lower costs, and faster lead times grows simultaneously. When production processes become more dynamic and complex, conventional quality management approaches show certain limitations. This paper gives an introduction to complexity science from a quality management perspective. By analyzing and evaluating different characteristics of complexity, the critical complexity parameters are identified and assessed. We found that the quality of leadership plays a crucial role when dealing with increasing complexity. Therefore, we developed a concept for qualitative leadership customized for management within complex processes, based on a maturity model. The maturity model was then applied in industry to assess the leadership quality of several shop floor managers, with positive evaluation feedback. As a result, the maturity model proved to be a sustainable approach to leveraging the rising complexity in production processes more effectively.
Keywords: maturity model, process complexity, quality of leadership, quality management
Procedia PDF Downloads 370
13549 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records
Authors: Sara ElElimy, Samir Moustafa
Abstract:
Mobile network operators are starting to face many challenges in the digital era, especially with the high demands of customers. Since mobile network operators are a major source of big data, traditional techniques are no longer effective in the new era of big data, the Internet of Things (IoT), and 5G; as a result, handling different big datasets effectively becomes a vital task for operators as data continue to grow and networks move from long term evolution (LTE) to 5G. There is therefore an urgent need for effective big data analytics to predict future demands, traffic, and network performance to fulfill the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and recurrent neural networks (RNN) are employed in a data-driven application for mobile network operators. The main framework of the models includes identification of the parameters of each model, estimation, prediction, and a final data-driven application of this prediction to business and network performance applications. These models are applied to the Telecom Italia Big Data Challenge call detail record (CDR) datasets. The performance of these models, assessed using specific well-known evaluation criteria, shows that ARIMA (the machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (the deep learning model).
Keywords: big data analytics, machine learning, CDRs, 5G
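As a sketch of the ARIMA workflow described above (identification, estimation, prediction), the example below fits an ARIMA model to a synthetic hourly traffic series with statsmodels. The series and the (p, d, q) order are illustrative, not the values used on the Telecom Italia CDR data.

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly call-volume series with a daily cycle (stand-in for CDR traffic).
rng = np.random.default_rng(0)
hours = pd.date_range("2024-01-01", periods=24 * 28, freq="h")
traffic = 100 + 20 * np.sin(2 * np.pi * np.arange(len(hours)) / 24) + rng.normal(0, 5, len(hours))
series = pd.Series(traffic, index=hours)

model = ARIMA(series, order=(2, 0, 1))       # (p, d, q) chosen for illustration only
fitted = model.fit()
forecast = fitted.forecast(steps=24)          # next-day hourly forecast
print(forecast.head())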
Procedia PDF Downloads 139
13548 Human Resource Management from Islamic Perspective
Authors: Qamar Ul Haq, Talat Hussain, Mufti Fahad Ahmed Qureshi
Abstract:
From the Islamic perspective, managing human resources meets various challenges, especially in modern organizations. The adoption of Western practices in various aspects of management has caused gaps in justice, trustworthiness, responsibility, and other values of workers in Muslim countries. Thus, the introduction of Islamic principles into human resource management (HRM) can be considered a great solution for treating employees fairly and justly. This research aims to examine the level of Islamic practices in HRM, which includes recruitment and selection, training and development, career development, performance management, and rewards. The paper will analyze the relationships between HRM practices and organizational justice, focusing on three elements: distributive justice, procedural justice, and interactional justice. The data will be collected from a selected Malaysian Government-Linked Company (GLC). Convenience sampling will be used to select the respondents for completing questionnaires. This conceptual paper essentially provides organizations with effective ways of understanding and implementing HRM using Islamic principles. It can also be used as guidance for decision-making and day-to-day HR activities and will help organizations face uncertainties in the business world.
Keywords: human resource management, organizational justice, Islam, Islamic banking
Procedia PDF Downloads 440
13547 Social Work Practice to Labour Welfare: A Proposed Model of Field Work Practicum and Role of Social Worker in India
Authors: Naeem Ahmed
Abstract:
Social work is a professional activity based on the approach of “helping people to help themselves” (Stroup). Social work education and practice are both based on a humanitarian philosophy in which social workers try to increase the happiness of society and to reduce its problems. Labour welfare is a specialised field of social work which focuses especially on the welfare of organised and unorganised labour. In India, labour faces numerous problems in both organised and unorganised sectors because of ignorance, illiteracy, a high rate of unemployment, etc. In most Indian social work institutions this specialization exists under different names, such as Human Resource Management, Industrial Relations and Personnel Management, Industrial Relations and Labour Welfare, or Industrial Social Work. Field work practice is an integral part of the social work education curriculum in all specialised fields. In India, different field work practice models are followed in different institutions. The main objective of this paper is to prepare a universal field work practicum model in the field of labour welfare. This paper is exploratory in nature; the researcher used personal experience and secondary data (models of field work practice in institutions such as Aligarh Muslim University, Pondicherry University, Central University of Karnataka, University of Lucknow, MJP Rohilkhand University Bareilly, etc.). The researcher found that there is an immediate need to upgrade the curriculum of field work practice in this particular field, as more than 40 percent of the total population is engaged in either the unorganised or the organised sector (NSSO 2011-12) and is not aware of its rights. In this way a social worker can play an important role in existing labour welfare facilities by raising awareness.
Keywords: field work, labour welfare, organised labour, social work practice, unorganised labour
Procedia PDF Downloads 401
13546 Knowledge Management in a Combined/Joint Environment
Authors: Cory Cannon
Abstract:
In the current era of shrinking budgets, increasing numbers of worldwide natural disasters, and state- and non-state-initiated conflicts, the response has involved multinational coalitions conducting effective military operations. The need for a knowledge management strategy when developing these coalitions has been overlooked in the past; developing these accords early on will save time and help shape the way information and knowledge are transferred from the staff and action officers of the coalition to the decision-makers in order to make timely decisions within an ever-changing environment. The aim of this paper is to show how knowledge management has developed within the United States military and how the transformation of working within a Combined/Joint environment in both the Middle East and the Far East has improved relations between members of the coalitions as well as made them more effective as a military force. These same principles could be applied to multinational corporations when dealing with different cultures and decision-making processes.
Keywords: civil-military, culture, joint environment, knowledge management
Procedia PDF Downloads 364
13545 Change Management as a Critical Success Factor in E-Government Initiatives
Authors: Mohammed Alassim
Abstract:
In 2014, a UN survey stated that "the greatest challenge to the adoption of whole-of-government, which fundamentally rests on increased collaboration, is resistance to change among government actors". Change management has undergone many transformations over the years, both theoretically and practically. When organizations have to implement radical changes, they encounter a plethora of issues, which in most cases leads to ineffective or inefficient implementation of change; 70% of change projects fail because of human issues. It has been cited that “most studies still show a 60-70% failure rate for organizational change projects — a statistic that has stayed constant from the 1970’s to the present.” E-government involves not just technical change but cultural, policy, social, and organizational evolution. Managing change and overcoming resistance to change are seen as crucial to the success of e-government projects. Resistance can come from different levels in the organization (top management, middle management, or employees at operational levels), and there can be many reasons for it, including fear of change and insecurity, lack of knowledge, and absence of commitment from management to implement the change. The purpose of this study is to conduct in-depth research to understand the process of change and to identify the critical factors that have led to resistance from employees at different levels (top management, middle management, and operational employees) during e-government initiatives in the public sector in Saudi Arabia. The study is based on qualitative and empirical research methods conducted in the public sector in the Kingdom of Saudi Arabia, using triangulation of data methods (interviews, group discussions, and document review). This research will contribute significantly to knowledge in this field and will identify the measures that can be taken to reduce resistance to change. Upon analysis, recommendations or a model will be offered to enable decision-makers in the public sector in Saudi Arabia to plan, implement, and evaluate change in e-government initiatives via a change management strategy.
Keywords: change management, e-government, managing change, resistance to change
Procedia PDF Downloads 315
13544 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard-negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
Keywords: retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition
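To illustrate the triplet-loss training of the embedding encoder described above, a minimal PyTorch sketch follows. The 128-d embedding head, margin, and dummy batch are illustrative assumptions; the paper's exact head, sampling, and hyperparameters are not specified here.

import torch
import torch.nn as nn
from torchvision import models

# Illustrative embedding encoder: a ResNet-18 backbone whose classifier head is
# replaced by a 128-d embedding layer.
backbone = models.resnet18(weights=None)
backbone.fc = nn.Linear(backbone.fc.in_features, 128)
criterion = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)

def training_step(anchor, positive, negative):
    # anchor/positive: crops of the same product; negative: a crop of a different product.
    optimizer.zero_grad()
    loss = criterion(backbone(anchor), backbone(positive), backbone(negative))
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch of 3-channel 224x224 crops, just to show the call pattern.
a, p, n = (torch.randn(8, 3, 224, 224) for _ in range(3))
print(training_step(a, p, n))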
Procedia PDF Downloads 156
13543 Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory
Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad
Abstract:
Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives.
Keywords: GAN, long short-term memory, synthetic data generation, traffic management
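As a sketch of the LSTM-based temporal modeling mentioned above, the example below trains a one-step-ahead forecaster on sliding windows of a synthetic 5-minute traffic series with Keras. The series, window length, and architecture are illustrative assumptions, not the configuration tuned on PeMS-Bay.

import numpy as np
import tensorflow as tf

# Synthetic 5-minute traffic counts for one sensor (stand-in for a PeMS-Bay series).
rng = np.random.default_rng(1)
series = 200 + 80 * np.sin(np.arange(2000) * 2 * np.pi / 288) + rng.normal(0, 10, 2000)

window = 12  # use the past hour (12 x 5 min) to predict the next interval
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.predict(X[-1:], verbose=0))   # forecast for the next 5-minute interval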
Procedia PDF Downloads 26
13542 Recycling in Bogotá: A SWOT Analysis of Three Associations to Evaluate the Integration of the Informal Sector into Solid Waste Management
Authors: Clara Inés Pardo Martínez
Abstract:
In emerging economies, recycling is an opportunity for cities to increase the lifespan of sanitary landfills, reduce the costs of solid waste management, decrease the environmental problems of waste treatment by reincorporating waste into the productive cycle, and protect and develop the livelihoods of informal waste pickers. However, few studies have analysed the possibilities and strategies for integrating the formal and informal sectors in solid waste management for the benefit of both. This study seeks to carry out a strengths, weaknesses, opportunities, and threats (SWOT) analysis of three recycling associations in Bogotá with the aim of understanding and determining the situation of recycling from the perspective of the informal sector in its transition to becoming authorized waste providers. Data used in the analysis are derived from multiple strategies such as a literature review, the Bogotá recycling database, focus group meetings, governmental reports, national laws and regulations, and specific interviews with key stakeholders. Results of this study show how the main stakeholders of the formal and informal sectors of waste management can identify the internal and external conditions of recycling in Bogotá. Several strategies were designed based on the SWOT factors determined and could be useful for Bogotá to advance and promote recycling as a key strategy for integrated sustainable waste management in the city.
Keywords: Bogotá, recycling, solid waste management, SWOT analysis
Procedia PDF Downloads 403
13541 Supply Chain Control and Inventory Management in Garment Industry
Authors: Nisa Nur Duman, Sümeyya Kiliç
Abstract:
In conditions of global competition, the survival of plants through obtaining competitive advantage relies on the effective usage of existing resources. In this way, plants can minimize their costs without losing quality, gain an advantage over their competitors, and enlarge their customer portfolio by increasing profit margins. The changing structure of markets and customer demands also changes the structure of competition between companies; furthermore, competition is no longer only between individual companies. In this manner, the supply chain and supply chain management gain importance when considering company performance. Companies that want to survive search for ways of decreasing costs and of meeting customer expectations. One of the important tools for reaching these goals is inventory management. The best inventory management system is one that meets demand while taking plant goals into account.
Keywords: supply chain, inventory management, apparel sector, garment industry
Procedia PDF Downloads 370
13540 Development of Groundwater Management Model Using Groundwater Sustainability Index
Authors: S. S. Rwanga, J. M. Ndambuki, Y. Woyessa
Abstract:
The development of a groundwater management model is an important step in the exploitation and management of any groundwater aquifer, as it assists in the long-term sustainable planning of the resource. The current study was conducted in the Central Limpopo province of South Africa with the overall objective of determining how much water can be withdrawn from the aquifer without producing non-reversible impacts on the groundwater quantity, hence developing a model which can sustainably protect the aquifer. The development was done through the computation of a Groundwater Sustainability Index (GSI). Values of GSI close to unity and above indicated overexploitation; in this study, an index of 0.8 was considered the overexploitation threshold. The results indicated that there is potential for higher abstraction rates compared to the current ones. The GSI approach can be used in the management of a groundwater aquifer to develop the resource sustainably, and it also provides water managers and policy makers with fundamental information on where future water developments can be carried out.
Keywords: development, groundwater, groundwater sustainability index, model
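The abstract does not give the GSI formula; one common way to express such an index is the ratio of abstraction to sustainable recharge. The sketch below assumes that definition together with the 0.8 threshold mentioned in the abstract, with all volumes being illustrative numbers.

def groundwater_sustainability_index(abstraction_mm3, recharge_mm3):
    # Assumed definition: GSI = abstraction / recharge (values near or above 1 indicate
    # overexploitation; the study flags 0.8 as the overexploitation threshold).
    return abstraction_mm3 / recharge_mm3

scenarios = {"current": 12.0, "planned": 18.0, "upper bound": 25.0}  # Mm3/year, illustrative
recharge = 28.0                                                      # Mm3/year, illustrative
for name, abstraction in scenarios.items():
    gsi = groundwater_sustainability_index(abstraction, recharge)
    status = "overexploited" if gsi >= 0.8 else "sustainable"
    print(f"{name}: GSI = {gsi:.2f} ({status})")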
Procedia PDF Downloads 169
13539 A Multi-Release Software Reliability Growth Model Incorporating Imperfect Debugging and Change-Point under the Simulated Testing Environment and Software Release Time
Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar
Abstract:
The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate a software product's reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that the operating and testing environments are the same. Practically, this is not true, because when the software works in a natural field environment its reliability differs. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and later extends it in a multi-release direction. Initially, the software is released to the market with few features; according to market demand, the software company upgrades the current version by adding new features as time passes. Therefore, we have proposed a generalized multi-release SRGM in which the change-point and imperfect-debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept has been adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets, and the results demonstrate that the proposed model fits the datasets better. We have also discussed the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors
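The paper's exact mean value function is not given in the abstract. As a sketch of the general idea, the code below evaluates a Goel-Okumoto-style NHPP mean value function whose fault-detection rate switches at a change point; the parameters a, b1, b2, and tau are illustrative assumptions.

import numpy as np

def mean_value_function(t, a=100.0, b1=0.05, b2=0.12, tau=20.0):
    # NHPP mean value function with a single change point at t = tau (Goel-Okumoto style):
    # the fault-detection rate switches from b1 to b2 once the testing process changes.
    t = np.asarray(t, dtype=float)
    exponent = np.where(t <= tau, b1 * t, b1 * tau + b2 * (t - tau))
    return a * (1.0 - np.exp(-exponent))

weeks = np.array([5, 10, 20, 30, 40])
print(mean_value_function(weeks))           # expected cumulative number of detected faults
print(np.diff(mean_value_function(weeks)))  # expected faults detected between observation times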
Procedia PDF Downloads 74
13538 Management of Therapeutic Anticancer Drugs at Oran Teaching Hospital, Algeria
Authors: S. Boulenouar, M. Sefir, M. Benahmed
Abstract:
All health facilities need medication and other pharmaceuticals for their operation. The purpose of management and supply is therefore to provide the different services of the facility with goods and services in the required quantity and quality. The permanent availability of drugs in facilities is very difficult to achieve because most face many difficulties in inventory management and drug supply. It is therefore necessary for each health facility to know the causes of the malfunctions of its management system in order to cope with them. It is in this context that we undertook this study, to identify the causes that should be taken into consideration by the concerned authorities in carrying out their mission of providing quality health care for the population. In terms of financial resources, the budget for medicines represents a significant part of the pharmacy budget; our study shows that the share of the hospital budget reserved for drug procurement represents on average 70% of the pharmacy budget. The results show a shortage of anticancer drugs at Oran teaching hospital. The analysis of the management process allowed us to determine at what level the problem of stock-outs of anticancer drugs occurs. Suggestions were made to improve the availability of these products and to respond better to the needs of patients.
Keywords: anticancer drugs, health care facility, budget, hospital pharmacist, hospital service
Procedia PDF Downloads 446
13537 Non-Linear Assessment of Chromatographic Lipophilicity and Model Ranking of Newly Synthesized Steroid Derivatives
Authors: Milica Karadzic, Lidija Jevric, Sanja Podunavac-Kuzmanovic, Strahinja Kovacevic, Anamarija Mandic, Katarina Penov Gasi, Marija Sakac, Aleksandar Okljesa, Andrea Nikolic
Abstract:
The present paper deals with the prediction of the chromatographic lipophilicity of newly synthesized steroid derivatives. The prediction was achieved using in silico generated molecular descriptors and quantitative structure-retention relationship (QSRR) methodology with an artificial neural network (ANN) approach. The chromatographic lipophilicity of the investigated compounds was expressed as the retention factor value logk. For QSRR modeling, a feedforward back-propagation ANN with a gradient descent learning algorithm was applied. Using the novel sum of ranking differences (SRD) method, the generated ANN models were ranked. The aim was to identify the most consistent QSRR model and to notice similarities or dissimilarities between the models. In this study, SRD was performed with the average retention factor values logk as reference values. An excellent correlation between the experimentally observed retention factor values logk and the values predicted by the ANN was obtained, with a correlation coefficient higher than 0.9890. The statistical results show that the established ANN models can be applied for the required purpose. This article is based upon work from COST Action (TD1305), supported by COST (European Cooperation in Science and Technology).
Keywords: artificial neural networks, liquid chromatography, molecular descriptors, steroids, sum of ranking differences
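As a QSRR-style sketch of the feedforward ANN regression described above, the example below fits scikit-learn's MLPRegressor (trained by stochastic gradient descent) on synthetic descriptor data. The descriptors, network size, and data are illustrative, not the study's descriptors or architecture.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
X = rng.normal(size=(60, 4))                   # 60 compounds x 4 in-silico descriptors (synthetic)
logk = 0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.05, 60)  # synthetic logk

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), solver="sgd", learning_rate_init=0.01,
                 max_iter=5000, random_state=0),   # feedforward ANN trained by gradient descent
)
model.fit(X, logk)
print("R^2 on training data:", round(model.score(X, logk), 4))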
Procedia PDF Downloads 319
13536 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of the seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IM), given source characteristics, source-to-site distance, and local site conditions, for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as the statistical method in ground motion prediction, such as artificial neural networks, random forests, and support vector machines. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with random forest in particular outperforming the other algorithms; however, the conventional method is a better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states and therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate estimates in a relatively shorter amount of time than conventional methods; moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analysis.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
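To illustrate a random-forest ground-motion model of the kind compared above, the sketch below predicts log10(PGA) from magnitude, distance, and a Vs30 site parameter on synthetic records. The data-generating formula and feature set are illustrative assumptions, not the study's database or attenuation relation.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
magnitude = rng.uniform(4.0, 7.5, n)
distance_km = rng.uniform(5.0, 200.0, n)
vs30 = rng.uniform(200.0, 800.0, n)                      # site-condition proxy
# Synthetic "recorded" log10(PGA) with magnitude scaling and distance attenuation.
log_pga = (-1.5 + 0.4 * magnitude - 1.2 * np.log10(distance_km)
           - 0.3 * np.log10(vs30 / 400.0) + rng.normal(0, 0.25, n))

X = np.column_stack([magnitude, distance_km, vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_pga, test_size=0.25, random_state=0)
gmm = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out records:", round(gmm.score(X_te, y_te), 3))
print("predicted log10(PGA) for M6.5 at 30 km, Vs30=400:", gmm.predict([[6.5, 30.0, 400.0]]))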
Procedia PDF Downloads 106
13535 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as the Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method, and they confirmed that the proposed method provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
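To illustrate the general MLE workflow for a regression model of losses (not the composite Exponential-Pareto model or the Mathematica/Fisher-scoring code of the paper), the sketch below fits a simple lognormal regression with multiple predictors by minimizing the negative log-likelihood; the data and model form are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0, 1, n), rng.integers(0, 2, n)])  # intercept + 2 predictors
true_beta, sigma = np.array([7.0, 1.2, -0.5]), 0.8
losses = np.exp(X @ true_beta + rng.normal(0, sigma, n))     # synthetic lognormal losses

def neg_log_likelihood(params):
    beta, log_sigma = params[:-1], params[-1]
    s = np.exp(log_sigma)
    mu = X @ beta                                            # log-scale regression function
    z = (np.log(losses) - mu) / s
    # Lognormal log-density summed over claims, with constant terms dropped.
    return np.sum(np.log(losses) + np.log(s) + 0.5 * z ** 2)

result = minimize(neg_log_likelihood, x0=np.array([1.0, 0.0, 0.0, 0.0]), method="BFGS")
beta_hat, sigma_hat = result.x[:-1], np.exp(result.x[-1])
print("estimated regression coefficients:", np.round(beta_hat, 3))
print("estimated sigma:", round(sigma_hat, 3))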
Procedia PDF Downloads 34
13534 Provisions for Risk in Islamic Banking and Finance in Comparison to the Conventional Banks in Malaysia
Authors: Rashid Masoud Ali Al-Mazrui, Ramadhani Mashaka Shabani
Abstract:
Islamic banks and financial institutions are exposed to the same risks as conventional banking. These risks include rate of return risk, credit or market risk, liquidity risk, and operational risk, among others. However, as financial institutions that conduct Islamic banking and finance operations, they face additional risks associated with their operations that differ from conventional finance, such as displaced commercial risk. They also face Shari'ah compliance risk arising from failure to follow Shari'ah principles. For proper mitigation and risk management, banks should have adequate risk management policies. This paper aims to study the risk management undertaken by Islamic banks in comparison with conventional banks. The study also evaluates the provisions for risk management made by selected Islamic and conventional banks. The study employs qualitative analysis using secondary data, applying a content analysis approach with a sample of four Islamic banks and four conventional banks over the period 2010 to 2020. We find that these banks all use the same techniques, except for the risks specific to Islamic banking; additional measures are used only for the additional risks that arise in Islamic banking and finance.
Keywords: emerging risk, risk management, Islamic banking, conventional bank
Procedia PDF Downloads 83
13533 Equilibrium and Kinetic Studies of Lead Adsorption on Activated Carbon Derived from Mangrove Propagule Waste by Phosphoric Acid Activation
Authors: Widi Astuti, Rizki Agus Hermawan, Hariono Mukti, Nurul Retno Sugiyono
Abstract:
The removal of lead ions (Pb2+) from aqueous solution by activated carbon prepared with phosphoric acid activation, employing mangrove propagule as the precursor, was investigated in a batch adsorption system. Batch studies were carried out to address various experimental parameters, including pH and contact time. The Langmuir and Freundlich models were used to describe the adsorption equilibrium, while the pseudo-first-order and pseudo-second-order models were used to describe the kinetics of Pb2+ adsorption. The results show that the adsorption data are in accordance with the Langmuir isotherm model and the pseudo-second-order kinetic model.
Keywords: activated carbon, adsorption, equilibrium, kinetic, lead, mangrove propagule
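As an illustration of the equilibrium modeling described above, the sketch below fits the Langmuir isotherm, q = q_max*K_L*Ce/(1+K_L*Ce), and the Freundlich isotherm, q = K_F*Ce^(1/n), to synthetic equilibrium data with scipy's curve_fit. The concentrations, uptakes, and starting guesses are illustrative, not the study's measurements.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    return q_max * k_l * ce / (1.0 + k_l * ce)

def freundlich(ce, k_f, n):
    return k_f * ce ** (1.0 / n)

# Synthetic equilibrium data: Ce (mg/L) and adsorbed amount qe (mg/g), illustrative only.
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([8.5, 17.0, 26.5, 35.0, 41.0, 44.5])

(q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=[50.0, 0.05])
(k_f, n), _ = curve_fit(freundlich, ce, qe, p0=[5.0, 2.0])
print(f"Langmuir:   q_max = {q_max:.1f} mg/g, K_L = {k_l:.3f} L/mg")
print(f"Freundlich: K_F = {k_f:.2f}, n = {n:.2f}")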
Procedia PDF Downloads 167
13532 Diabetes Mellitus and Blood Glucose Variability Increase the 30-day Readmission Rate after Kidney Transplantation
Authors: Harini Chakkera
Abstract:
Background: Inpatient hyperglycemia is an established independent risk factor for hospital readmission in several patient cohorts, but this has not been studied after kidney transplantation. Nearly one-third of patients who have undergone a kidney transplant reportedly experience 30-day readmission. Methods: Data on first-time solitary kidney transplantations performed between September 2015 and December 2018 were retrieved. The information was linked to the electronic health record to determine diagnoses of diabetes mellitus and to extract glucometric and insulin therapy data. Univariate logistic regression analysis and the XGBoost algorithm were used to predict 30-day readmission. We report the average performance of the models on the testing set over five bootstrapped partitions of the data to ensure statistical significance. Results: The cohort included 1036 patients who received a kidney transplant, of whom 224 (22%) experienced 30-day readmission. The machine learning algorithm was able to predict 30-day readmission with an average AUC of 77.3% (95% CI 75.3-79.3%). We observed statistically significant differences in the presence of pretransplant diabetes, inpatient hyperglycemia, inpatient hypoglycemia, and minimum and maximum glucose values among those with higher 30-day readmission rates. The XGBoost model identified the index admission length of stay, the presence of hyper- and hypoglycemia, and recipient and donor BMI values as the most predictive risk factors for 30-day readmission. Additionally, significant variations in the therapeutic management of blood glucose by providers were observed. Conclusions: Suboptimal glucose metrics during hospitalization after kidney transplantation are associated with an increased risk of 30-day hospital readmission. Optimizing hospital blood glucose management, a modifiable factor, after kidney transplantation may reduce the risk of 30-day readmission.
Keywords: kidney, transplant, diabetes, insulin
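As a sketch of the XGBoost-with-repeated-splits workflow described above, the example below trains a classifier on synthetic glucometric-style features and averages the AUC over five random splits. The feature names, data-generating rule, and hyperparameters are illustrative assumptions, not the study's cohort or tuned model.

import numpy as np
from xgboost import XGBClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
n = 1000
X = np.column_stack([
    rng.normal(7, 3, n),          # index admission length of stay (days)
    rng.normal(160, 40, n),       # maximum inpatient glucose (mg/dL)
    rng.normal(90, 15, n),        # minimum inpatient glucose (mg/dL)
    rng.normal(27, 5, n),         # recipient BMI
])
logit = -4 + 0.15 * X[:, 0] + 0.01 * X[:, 1] - 0.01 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # synthetic 30-day readmission labels

aucs = []
for seed in range(5):                            # repeated random splits, bootstrap-style
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed)
    clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
    clf.fit(X_tr, y_tr)
    aucs.append(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
print("mean AUC over 5 splits:", round(float(np.mean(aucs)), 3))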
Procedia PDF Downloads 90