Search results for: Business Model
17001 Impact of Extended Enterprise Resource Planning in the Context of Cloud Computing on Industries and Organizations
Authors: Gholamreza Momenzadeh, Forough Nematolahi
Abstract:
Extended Enterprise Resource Planning (ERPII) systems usually require massive amounts of storage space, powerful servers, and large upfront and ongoing investments to purchase and manage the software and related hardware, which many organizations cannot afford. In recent decades, organizations have preferred to adapt their business structures to new technologies in order to remain competitive in the world economy. Cloud computing, one of the tools of information technology (IT), is a modern system that represents the next-generation application architecture. Cloud computing also reduces costs in several ways, such as lower upfront costs for computing infrastructure and lower maintenance and support costs. On the other hand, traditional ERPII cannot cope with the huge amounts of data and the relations between organizations. In this study, based on a literature review, ERPII is investigated in the context of cloud computing, where organizations can operate more efficiently and ERPII can respond to organizational needs involving large amounts of data and inter-organizational relations.
Keywords: extended enterprise resource planning, cloud computing, business process, enterprise information integration
Procedia PDF Downloads 222
17000 A Model for Optimizing Inventory Replenishment and Shelf Space Management in Retail Industries
Authors: Nermine A. Harraz, Aliaa Abouali
Abstract:
Retail stores offer multiple items for sale, while the space in the backroom and display areas constitutes a scarce resource. The availability, volume, and location of a product displayed in the showroom influence customer demand. Managing these operations individually results in a sub-optimal overall store profit; therefore, a non-linear integer programming (NLIP) model is developed to determine the inventory replenishment and shelf space allocation decisions that jointly maximize the retailer's profit under shelf space and backroom storage constraints, taking into consideration that the demand rate depends positively on the amount and location of items displayed in the showroom. The developed model is solved using LINGO® software. The NLIP model is applied to a real-world case study in a large retail outlet offering a wide variety of products. The proposed model is validated and shows logical results when using the experimental data collected from the market.
Keywords: retailing management, inventory replenishment, shelf space allocation, showroom, backroom
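As an illustration of the kind of joint replenishment and shelf-space decision the abstract describes, the sketch below solves a continuous relaxation of such a problem with SciPy. The power-law demand response to displayed facings, the numeric values, and the simplified cost terms are assumptions for illustration only; the authors' actual NLIP formulation and LINGO® implementation are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data for three items (all values hypothetical).
margin = np.array([4.0, 2.5, 3.0])    # profit per unit sold
alpha = np.array([20.0, 35.0, 15.0])  # baseline demand scale
beta = np.array([0.3, 0.2, 0.4])      # space elasticity of demand
hold = np.array([0.5, 0.3, 0.4])      # holding cost per unit of average inventory
shelf_cap, backroom_cap = 30.0, 60.0  # scarce shelf and backroom space

def neg_profit(x):
    s, q = x[:3], x[3:]               # s: shelf facings, q: replenishment quantities
    demand = alpha * s**beta          # demand grows with displayed space
    return -(margin @ demand - hold @ (q / 2.0))

cons = [
    {"type": "ineq", "fun": lambda x: shelf_cap - x[:3].sum()},      # shelf space limit
    {"type": "ineq", "fun": lambda x: backroom_cap - x[3:].sum()},   # backroom limit
    {"type": "ineq", "fun": lambda x: x[3:] - alpha * x[:3]**beta},  # stock covers demand
]
x0 = np.concatenate([np.full(3, 5.0), np.full(3, 15.0)])
res = minimize(neg_profit, x0, bounds=[(1.0, None)] * 6, constraints=cons, method="SLSQP")
print("facings:", res.x[:3].round(2), "order quantities:", res.x[3:].round(2),
      "profit:", round(-res.fun, 2))
```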
Procedia PDF Downloads 354
16999 Enhancing Cloud Computing with Security Trust Model
Authors: John Ayoade
Abstract:
Cloud computing is a model that enables the delivery of on-demand computing resources such as networks, servers, storage, applications, and services over the internet. Cloud computing is a relatively new but growing concept that offers a good number of benefits to its users; however, it also raises security challenges that may slow down its adoption. In this paper, we identify some of the security issues that can act as barriers to realizing the full benefits that cloud computing can bring. One of the key security problems is trust. A security trust model is proposed that can enhance the confidence users need to fully trust public and mobile cloud computing and maximize the potential benefits they offer.
Keywords: cloud computing, trust, security, certificate authority, PKI
Procedia PDF Downloads 484
16998 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes many of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal, as special cases. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran and provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal alternatives.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
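A minimal two-part sketch in Python, assuming synthetic cost data, is given below: a zero/positive part plus a three-parameter generalized gamma fit to the positive costs via scipy.stats.gengamma. It illustrates only the two-part and GG ingredients of the approach; the marginalized joint (MTJM-GG) formulation and the authors' SAS code are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic semi-continuous "cost" data: many exact zeros plus a right-skewed positive part.
n = 2000
has_cost = rng.random(n) < 0.6
cost = np.zeros(n)
cost[has_cost] = stats.gengamma.rvs(a=2.0, c=0.8, scale=150.0,
                                    size=has_cost.sum(), random_state=rng)

# Part 1: probability of incurring any cost (with covariates this would be a logistic model).
p_any = (cost > 0).mean()

# Part 2: three-parameter generalized gamma fitted to the positive costs only (loc fixed at 0).
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(cost[cost > 0], floc=0)

# The marginal (unconditional) mean combines the two parts.
mean_pos = stats.gengamma.mean(a_hat, c_hat, loc=loc_hat, scale=scale_hat)
print(f"P(cost>0) = {p_any:.2f}, E[cost | cost>0] = {mean_pos:.1f}, "
      f"E[cost] = {p_any * mean_pos:.1f}")
```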
Procedia PDF Downloads 176
16997 Different Sampling Schemes for Semi-Parametric Frailty Model
Authors: Nursel Koyuncu, Nihal Ata Tutkun
Abstract:
The frailty model is a survival model that accounts for unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, survival models have become more complex, which causes convergence problems, especially in large data sets. Selecting a sample from these big data sets is therefore very important for parameter estimation. In the sampling literature, several authors have proposed new sampling schemes to estimate parameters more accurately. With this aim, we examine the effect of the sampling design on the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. The population for the simulation study was a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1,000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.
Keywords: frailty model, ranked set sampling, efficiency, simple random sampling
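The sketch below contrasts the two sampling schemes named in the abstract on a skewed synthetic population, showing the typical variance advantage of ranked set sampling over simple random sampling. The Weibull stand-in population, set size, and Monte Carlo settings are assumptions for illustration; the semi-parametric frailty fit itself (done in R by the authors) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)
# Skewed stand-in for survival-type data (population of 17,260 units).
population = rng.weibull(1.5, size=17260) * 10.0

def srs_mean(pop, n):
    """Mean of a simple random sample of size n (without replacement)."""
    return rng.choice(pop, size=n, replace=False).mean()

def rss_mean(pop, set_size, cycles):
    """Mean of a ranked set sample: in each cycle, draw `set_size` candidate sets,
    rank each set, and keep the i-th order statistic from the i-th set."""
    sample = []
    for _ in range(cycles):
        for i in range(set_size):
            candidates = rng.choice(pop, size=set_size)  # with replacement, for speed
            sample.append(np.sort(candidates)[i])
    return np.mean(sample)

n, set_size, reps = 1000, 5, 200
srs_means = [srs_mean(population, n) for _ in range(reps)]
rss_means = [rss_mean(population, set_size, n // set_size) for _ in range(reps)]
print("true mean:            ", round(population.mean(), 3))
print("SRS variance of mean: ", round(float(np.var(srs_means)), 6))
print("RSS variance of mean: ", round(float(np.var(rss_means)), 6))  # typically smaller
```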
Procedia PDF Downloads 211
16996 An Approach for Modeling CMOS Gates
Authors: Spyridon Nikolaidis
Abstract:
A modeling approach for CMOS gates based on the use of the equivalent inverter is presented. A new model for the inverter has been developed using a simplified transistor current model that incorporates nanoscale effects for planar technology. Parametric expressions for the output voltage are provided, as well as the values of the output and supply current, to be compatible with the CCS technology. The model is parametric with respect to the input signal slew, output load, transistor widths, supply voltage, temperature, and process. The transistor widths of the equivalent inverter are determined by HSPICE simulations, and parametric expressions are developed for them using a fitting procedure. Results for the NAND gate show that the proposed approach offers sufficient accuracy, with an average error in propagation delay of about 5%.
Keywords: CMOS gate modeling, inverter modeling, transistor current mode, timing model
Procedia PDF Downloads 423
16995 Experiential Learning for Upholding Entrepreneurship Education: A Case Study from Egypt
Authors: Randa El Bedawy
Abstract:
The exchange of best practices in entrepreneurship education and the use of experiential learning approaches have been growing at a very fast pace. Educators should be challenged to promote such a learning approach to bridge the gap between entrepreneurship students and the actual business environment. The study aims to share best practices, experiences, and knowledge to support entrepreneurship education. The study is exploratory qualitative research based on a case study approach that demonstrates how experiential learning can support learning effectiveness in entrepreneurship education, through a set of fourteen tasks used to practically engage students taking an entrepreneurship course at the American University in Cairo. The study sheds light on the rationale of using experiential learning to endorse entrepreneurship education by illustrating each task along with its learning outcomes. The study also explores the benefits and obstacles that educators may face when implementing such an experiential approach. The results confirm that developing an experiential learning approach based on a set of well-designed practical tasks that complement the overall intended learning outcomes is very effective in promoting students' learning of entrepreneurship. However, good preparation by both educators and students is needed to ensure the effective implementation of such an experiential learning approach.
Keywords: business education, entrepreneurship, entrepreneurship education, experiential learning
Procedia PDF Downloads 163
16994 Two-Stage Launch Vehicle Trajectory Modeling for Low Earth Orbit Applications
Authors: Assem M. F. Sallam, Ah. El-S. Makled
Abstract:
This paper presents a study of the trajectory of a two-stage launch vehicle. The study includes the dynamic responses of the motion parameters as well as the variation of the angles affecting the orientation of the launch vehicle (LV). LV dynamic characteristics, including the state vector variation with the corresponding altitude and velocity at the separation of the different LV stages, as well as the angle of attack and flight path angle, are also discussed. The drop zone of the first stage and the jettisoning of the fairing are introduced into the mathematical model to study their effect on the flight trajectory. To increase the accuracy of the LV model, an atmospheric model is used that takes into consideration the geographical location and the solar flux values corresponding to the date and time of launch; an accurate atmospheric model improves the calculation of the Mach number, which affects the drag force on the LV. The mathematical model is implemented in MATLAB-based software (Simulink). Available experimental data are compared with the results obtained from the theoretical computation model. The comparison shows good agreement, which proves the validity of the developed simulation model; the maximum error observed was generally less than 10%, a result that opens the way for future work to reduce this error further.
Keywords: launch vehicle modeling, launch vehicle trajectory, mathematical modeling, Matlab-Simulink
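A minimal point-mass sketch of a two-stage ascent with staging, an exponential atmosphere, and a gravity-turn thrust direction is shown below; it is only a schematic analogue of the Simulink model described. The vehicle data (masses, thrust, Isp, burn times) and the constant drag area are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

g0, Re = 9.81, 6.371e6
rho0, Hs = 1.225, 7200.0                   # simple exponential atmosphere
CdA = 3.0                                  # assumed constant drag area term (Cd*A, m^2)

# Hypothetical two-stage vehicle: gross mass, propellant mass, thrust, Isp, burn time.
stage1 = dict(m_gross=50000., m_prop=38000., thrust=8.0e5, isp=280., tb=120.)
stage2 = dict(m_gross=10000., m_prop=7500.,  thrust=1.2e5, isp=320., tb=180.)

def dynamics(t, y, thrust, isp, t0, tb):
    x, h, vx, vh, m = y
    v = np.hypot(vx, vh) + 1e-6
    rho = rho0 * np.exp(-max(h, 0.0) / Hs)
    drag = 0.5 * rho * v**2 * CdA
    T = thrust if (t - t0) < tb else 0.0
    g = g0 * (Re / (Re + h))**2
    ax = (T - drag) / m * (vx / v)          # thrust and drag act along the velocity
    ah = (T - drag) / m * (vh / v) - g      #   vector (gravity-turn assumption)
    return [vx, vh, ax, ah, -T / (isp * g0)]

# Stage 1 burn: nearly vertical start with a small horizontal kick to begin the turn.
y0 = [0.0, 0.0, 1.0, 50.0, stage1["m_gross"] + stage2["m_gross"]]
s1 = solve_ivp(dynamics, (0.0, stage1["tb"]), y0, max_step=1.0,
               args=(stage1["thrust"], stage1["isp"], 0.0, stage1["tb"]))

# Staging: jettison the empty first stage, then burn stage 2.
y1 = s1.y[:, -1].copy()
y1[4] -= stage1["m_gross"] - stage1["m_prop"]
s2 = solve_ivp(dynamics, (s1.t[-1], s1.t[-1] + stage2["tb"]), y1, max_step=1.0,
               args=(stage2["thrust"], stage2["isp"], s1.t[-1], stage2["tb"]))

print(f"staging: h = {s1.y[1, -1]/1e3:.1f} km, v = {np.hypot(*s1.y[2:4, -1]):.0f} m/s")
print(f"burnout: h = {s2.y[1, -1]/1e3:.1f} km, v = {np.hypot(*s2.y[2:4, -1]):.0f} m/s")
```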
Procedia PDF Downloads 276
16993 Calibration and Validation of the Aquacrop Model for Simulating Growth and Yield of Rain-Fed Sesame (Sesamum Indicum L.) Under Different Soil Fertility Levels in the Semi-arid Areas of Tigray, Ethiopia
Authors: Abadi Berhane, Walelign Worku, Berhanu Abrha, Gebre Hadgu
Abstract:
Sesame is an important oilseed crop in Ethiopia and the second most exported agricultural commodity after coffee. However, soil fertility management for the crop is poor, and the farming system is not research-led. The AquaCrop model was applied as a decision-support tool; it follows a semi-quantitative approach to simulate crop yield under different soil fertility levels. The objective of this experiment was to calibrate and validate the AquaCrop model for simulating the growth and yield of sesame under different nitrogen fertilizer levels and to test the performance of the model as a decision-support tool for improved sesame cultivation in the study area. The experiment was laid out as a randomized complete block design (RCBD) in a factorial arrangement in the 2016, 2017, and 2018 main cropping seasons. Four nitrogen fertilizer rates (0, 23, 46, and 69 kg/ha nitrogen) and three improved varieties (Setit-1, Setit-2, and Humera-1) were evaluated, and growth, yield, and yield components of sesame were collected from each treatment. The coefficient of determination (R2), root mean square error (RMSE), normalized root mean square error (N-RMSE), model efficiency (E), and degree of agreement (D) were used to test the performance of the model. The results indicated that the AquaCrop model successfully simulated soil water content, with R2 varying from 0.92 to 0.98, RMSE from 6.5 to 13.9 mm, E from 0.78 to 0.94, and D from 0.95 to 0.99; the corresponding values for aboveground biomass (AB) varied from 0.92 to 0.98, 0.33 to 0.54 tons/ha, 0.74 to 0.93, and 0.9 to 0.98, respectively. The results for canopy cover also showed that the model acceptably simulated canopy cover, with R2 varying from 0.95 to 0.99 and an RMSE of 5.3 to 8.6%. The AquaCrop model was appropriately calibrated to simulate soil water content, canopy cover, aboveground biomass, and sesame yield, and it adequately simulated the growth and yield of sesame under the different nitrogen fertilizer levels. The AquaCrop model might therefore be an important tool for improved soil fertility management and yield enhancement strategies and might be applied as a decision-support tool in soil fertility management for sesame production.
Keywords: aquacrop model, normalized water productivity, nitrogen fertilizer, canopy cover, sesame
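The goodness-of-fit statistics quoted in the abstract can be computed as in the sketch below, assuming E is the Nash-Sutcliffe model efficiency and D is Willmott's index of agreement (a common reading of those labels). The observed/simulated soil water values are made up for illustration.

```python
import numpy as np

def evaluate(obs, sim):
    """Goodness-of-fit statistics commonly used to test AquaCrop simulations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2                      # coefficient of determination
    rmse = np.sqrt(np.mean(resid ** 2))                        # root mean square error
    nrmse = 100.0 * rmse / obs.mean()                          # normalized RMSE (%)
    e = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe efficiency
    d = 1.0 - np.sum(resid ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)  # Willmott's index of agreement
    return dict(R2=r2, RMSE=rmse, NRMSE=nrmse, E=e, D=d)

# Example with made-up soil water content values (mm).
observed = [310, 295, 280, 268, 250, 240, 255, 270]
simulated = [305, 300, 276, 262, 255, 236, 250, 275]
print({k: round(v, 3) for k, v in evaluate(observed, simulated).items()})
```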
Procedia PDF Downloads 79
16992 Designing Inventory System with Constrained by Reducing Ordering Cost, Lead Time and Lost Sale Rate and Considering Random Disturbance in Ordering Quantity
Authors: Arezoo Heidary, Abolfazl Mirzazadeh, Aref Gholami-Qadikolaei
Abstract:
In the business environment, it is very common for the lot received to differ from the quantity ordered. In this work, a random disturbance in the received quantity is considered, and maximum allowable limits on storage space and inventory investment are assumed. The impact of lead time and ordering cost reductions when they act dependently is also investigated. Further, considering a mixture of backorders and lost sales for the allowable-shortage system, the effect of investment in reducing the lost sale rate is analyzed. For the proposed control system, a Lagrangian method is applied to solve the problem, and an algorithmic procedure is used to reach the optimal solution with the global minimum expected cost. Finally, proofs of the concavity and convexity of the model in the decision variables are provided.
Keywords: stochastic inventory system, lead time, ordering cost, lost sale rate, inventory constraints, random disturbance
Procedia PDF Downloads 419
16991 Physical Characterization of a Watershed for Correlation with Parameters of Thomas Hydrological Model and Its Application in Iber Hidrodinamic Model
Authors: Carlos Caro, Ernest Blade, Nestor Rojas
Abstract:
This study determined the relationship between basic geotechnical parameters and the parameters of the Thomas hydrological model for the water balance of rural watersheds, as a methodological calibration aid applicable to distributed models such as the Iber model, a distributed simulation model for unsteady free-surface flow. Soil samples were taken at 25 points over the 15 sub-basins of the Rio Piedras (Boy.) basin, and their geotechnical characterization was performed through laboratory tests. The Thomas model characterizes the input area physically with only four parameters (a, b, c, d). Establishing a measurable relationship between geotechnical parameters and these four hydrological parameters helps determine subsurface, groundwater, and surface flow in a more agile manner; the aim is thus to constrain the initial limits of the model parameters on the basis of the geotechnical characterization. In hydrogeological models of rural watersheds, calibration is an important step in the characterization of the study area. This step can require significant computational cost and time, especially if the initial parameter values before calibration lie outside the geotechnical reality. A better approach to these initial values means optimizing the process through the geotechnical characterization of the area's materials, which provides an important starting range of variation for the calibration parameters.
Keywords: distributed hydrology, hydrological and geotechnical characterization, Iber model
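The Thomas model referred to here is commonly the "abcd" monthly water balance model, whose four parameters the study relates to geotechnical properties; a minimal sketch of its standard recursion is given below. The parameter values, initial storages, and forcing series are hypothetical, and this is not the authors' calibration.

```python
import numpy as np

def thomas_abcd(precip, pet, a=0.98, b=400.0, c=0.35, d=0.1, s0=100.0, g0=50.0):
    """Classical Thomas 'abcd' monthly water balance (illustrative parameter values).
    a: runoff propensity before saturation, b: upper soil storage limit,
    c: recharge fraction, d: groundwater recession. Returns simulated streamflow (mm)."""
    s, g, q = s0, g0, []
    for p, e in zip(precip, pet):
        w = p + s                                   # available water
        y = (w + b) / (2 * a) - np.sqrt(((w + b) / (2 * a)) ** 2 - w * b / a)
        s = y * np.exp(-e / b)                      # end-of-month soil moisture
        surplus = w - y                             # water not retained in the soil zone
        g = (g + c * surplus) / (1 + d)             # groundwater storage
        q.append((1 - c) * surplus + d * g)         # direct runoff + baseflow
    return np.array(q)

# Twelve months of made-up precipitation and potential ET (mm).
p = [120, 95, 80, 60, 30, 10, 5, 15, 40, 90, 130, 140]
pe = [60, 70, 90, 100, 110, 120, 125, 115, 95, 80, 65, 60]
print(thomas_abcd(p, pe).round(1))
```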
Procedia PDF Downloads 522
16990 The Determinants of Enterprise Risk Management: Literature Review, and Future Research
Authors: Sylvester S. Horvey, Jones Mensah
Abstract:
The growing complexity and dynamism of the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and an approach to managing the risks of an organization in an integrated manner in order to achieve corporate goals and strategic objectives. Regardless of the diversity of the business environment, ERM has become an essential factor in managing individual and business risks because it is believed to enhance shareholder value and firm growth. Despite the growing literature on ERM, the question of what factors drive ERM remains under-examined. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, and the review spanned 2000 to 2020. Articles published in Scimago-ranked and Scopus-indexed journals were examined. Thirteen firm characteristics and sixteen articles were considered in the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and the presence of an internal auditor have a positive relationship with ERM adoption, with firm size, institutional ownership, auditor type, and industry type most often found to be statistically significant. Other factors, such as financial leverage, profitability, asset opacity, international diversification, and firm complexity, showed inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants within new geographical contexts while considering a new and more robust way of measuring ERM rather than relying on a simple proxy (dummy) variable. Other firm characteristics, such as organizational culture and context, corporate scandals and losses, and governance, could also be considered as determinants of ERM adoption.
Keywords: enterprise risk management, determinants, ERM adoption, literature review
Procedia PDF Downloads 173
16989 Model Predictive Control with Unscented Kalman Filter for Nonlinear Implicit Systems
Authors: Takashi Shimizu, Tomoaki Hashimoto
Abstract:
Implicit systems are known as a more general class of systems than explicit systems. To establish a control method for such a generalized class of systems, we adopt model predictive control, a form of optimal feedback control with a performance index defined over a horizon with moving initial and terminal times. However, model predictive control is inapplicable to systems whose state variables are not all exactly known, in other words, to systems with limited measurable states. In practice, the state variables of a system are usually measured through its outputs, so only limited parts of them can be used directly, and the output signals are disturbed by process and sensor noise. It is therefore important to establish a state estimation method for nonlinear implicit systems that takes process noise and sensor noise into consideration. To this purpose, we apply the model predictive control method and the unscented Kalman filter to solve the optimization and estimation problems of nonlinear implicit systems, respectively. The objective of this study is to establish model predictive control with an unscented Kalman filter for nonlinear implicit systems.
Keywords: optimal control, nonlinear systems, state estimation, Kalman filter
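A from-scratch sketch of one unscented Kalman filter predict/update cycle is shown below for a generic explicit nonlinear state-space model with noisy measurements; the implicit-system formulation and the coupling with model predictive control discussed in the paper are not reproduced. The toy dynamics, measurement model, and noise covariances are assumptions.

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Scaled sigma points and their mean/covariance weights."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    pts = np.vstack([x, x + S.T, x - S.T])            # 2n+1 sigma points
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return pts, wm, wc

def ukf_step(x, P, z, f, h, Q, R):
    """One predict/update cycle of an unscented Kalman filter."""
    pts, wm, wc = sigma_points(x, P)
    fx = np.array([f(p) for p in pts])                # propagate through the dynamics f
    x_pred = wm @ fx
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(wc, fx - x_pred))
    hx = np.array([h(p) for p in fx])                 # map through the measurement model h
    z_pred = wm @ hx
    Pzz = R + sum(w * np.outer(d, d) for w, d in zip(wc, hx - z_pred))
    Pxz = sum(w * np.outer(dx, dz) for w, dx, dz in zip(wc, fx - x_pred, hx - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)
    return x_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T

# Toy example: noisy range measurement of a 2-D state with mildly nonlinear dynamics.
f = lambda x: np.array([x[0] + 0.1 * x[1], 0.98 * x[1] + 0.05 * np.sin(x[0])])
h = lambda x: np.array([np.hypot(x[0], x[1])])
x, P = np.array([1.0, 0.5]), np.eye(2)
Q, R = 0.01 * np.eye(2), np.array([[0.1]])
rng = np.random.default_rng(1)
truth = np.array([1.0, 0.5])
for _ in range(20):
    truth = f(truth)
    z = h(truth) + rng.normal(0, np.sqrt(R[0, 0]), 1)
    x, P = ukf_step(x, P, z, f, h, Q, R)
print("estimate:", x.round(3), "truth:", truth.round(3))
```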
Procedia PDF Downloads 202
16988 Deep Routing Strategy: Deep Learning Based Intelligent Routing in Software Defined Internet of Things
Authors: Zabeehullah, Fahim Arif, Yawar Abbas
Abstract:
Software Defined Networking (SDN) is a next-generation networking model that simplifies traditional network complexity and improves the utilization of constrained resources. Currently, most SDN-based Internet of Things (IoT) environments use traditional routing strategies that work on the basis of a maximum or minimum metric value. However, IoT network heterogeneity, dynamic traffic flows, and complexity demand intelligent and self-adaptive routing algorithms, because traditional routing algorithms lack self-adaptation, intelligence, and efficient utilization of resources. To some extent, SDN, thanks to its flexibility and centralized control, has managed the IoT complexity and heterogeneity, but Software Defined IoT (SDIoT) still lacks intelligence. To address this challenge, we propose a model called Deep Routing Strategy (DRS), which uses a deep learning algorithm to perform routing in SDIoT intelligently and efficiently. Our model uses real-time traffic for training and learning. Results demonstrate that the proposed model achieves high accuracy and a low packet loss rate during path selection and outperforms the benchmark routing algorithm (OSPF). Moreover, the proposed model provides encouraging results under highly dynamic traffic flows.
Keywords: SDN, IoT, DL, ML, DRS
Procedia PDF Downloads 110
16987 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is governed by Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, the risk that arises when customers fail to repay the amount owed to the lender (a bank or other financial institution). The paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique and thus predicts the occurrence of non-performing assets (NPA). It also helps in asset building and in improving restructuring quality. Liability management is equally important to the banking business: to understand and analyse the depth of a bank's liabilities, a suitable technique is required, and a data mining technique is therefore used to predict the dormancy behaviour of deposit customers. Various models are implemented, and the results for savings deposit customers are analysed. All the data are cleaned using a data cleansing approach applied to the bank's data warehouse.
Keywords: data mining, asset liability management, BASEL III, banking
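As a hedged illustration of the kind of counterparty-default (NPA) prediction described, the sketch below trains a random forest on synthetic borrower attributes; the paper does not specify this particular technique, and all feature names and the default-generating rule are hypothetical stand-ins for cleansed warehouse data.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n = 5000
# Synthetic, hypothetical borrower attributes.
df = pd.DataFrame({
    "loan_amount":   rng.lognormal(10, 0.6, n),
    "interest_rate": rng.uniform(0.06, 0.18, n),
    "tenure_months": rng.integers(12, 240, n),
    "dpd_last_year": rng.poisson(2, n),          # days-past-due events
    "utilisation":   rng.uniform(0, 1, n),
})
# Synthetic default flag loosely driven by past delinquency and utilisation.
logit = -3 + 0.5 * df["dpd_last_year"] + 2.0 * df["utilisation"]
df["default"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="default"), df["default"], test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```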
Procedia PDF Downloads 552
16986 Strategies Employed to Enhance Floriculture Production for Masvingo City Residents’ Livelihood Improvement
Authors: Jotham Mazhura
Abstract:
Floriculture is an ideal project for sustainable horticultural production in Masvingo City. Gender Links, in collaboration with the Embassy of Sweden, is supporting the floriculture project with the aim of improving residents' livelihoods in the city. World trade in floriculture products such as cut flowers, live ornamental plants, and foliage continues to increase, and there are recognised market opportunities across the globe. In an interview discussion, the consultant appointed by Gender Links and the Embassy of Sweden highlighted some specific constraints and opportunities for the floriculture project in Masvingo City. Based on the outcome of these scoping studies, this research project developed and evaluated strategies for enhancing floriculture production in the city. A survey was therefore carried out by the researcher among existing florist farmers in the city to determine the strategies to be employed to improve floriculture production. The survey covered twenty florists, selected through purposive sampling, a technique based on specific considerations, so basic criteria were used in selecting the sample. A questionnaire was administered to the 20 florists to determine the essential strategies to be employed to enhance floriculture production. Each respondent was presented with a list of business strategies and asked to rank them from the most to the least important. The research findings revealed the following strategies: capturing market share, establishing ownership of the project, having an innovative project manager, and gaining competitive advantage through generic strategies, a market development strategy, and a product development strategy. Based on observation and structured interviews with respondents, most floriculture owners had implemented similar strategies in their businesses. The research showed that floriculture farmers use various strategies to keep their businesses running and to succeed in achieving set goals. The researcher, who is also the project focal person, therefore concluded that it is ideal to employ a variety of strategies to improve floriculture production.
Keywords: florist, floriculture, strategy, livelihoods
Procedia PDF Downloads 86
16985 Modeling and Optimization of a Microfluidic Electrochemical Cell for the Electro-Reduction of CO₂ to CH₃OH
Authors: Barzin Rajabloo, Martin Desilets
Abstract:
First, an electrochemical model for the reduction of CO₂ into CH₃OH is developed in which mass and charge transfer, reactions at the surface of the electrodes, and the fluid flow of the electrolyte are considered. This mathematical model is developed in COMSOL Multiphysics®, where the secondary and tertiary current distribution interfaces are coupled to account for the concentrations and potentials inside the different parts of the cell. Constant reaction rates are assumed as the fitted parameters that minimize the error between the experimental data and the modeling results. The model is validated through comparison with experimental data in terms of the faradaic efficiency of CH₃OH production, the current density at different applied cathode potentials, and the current density at different electrolyte flow rates. The comparison between the model outputs and the experimental measurements shows good agreement. The model indicates higher hydrogen evolution compared with CH₃OH production, as well as a mass transfer limitation caused by the CO₂ concentration, both consistent with findings in the literature. After validating the model, in the second part of the study, some design parameters of the cell, such as the cathode geometry and the catholyte/anolyte channel widths, are modified to reach better performance and a higher faradaic efficiency of methanol production.
Keywords: carbon dioxide, electrochemical reduction, methanol, modeling
Procedia PDF Downloads 109
16984 Optimizing Organizational Performance: The Critical Role of Headcount Budgeting in Strategic Alignment and Financial Stability
Authors: Shobhit Mittal
Abstract:
Headcount budgeting stands as a pivotal element in organizational financial management, extending beyond traditional budgeting to encompass strategic resource allocation for workforce-related expenses. This process is integral to maintaining financial stability and fostering a productive workforce, requiring a comprehensive analysis of factors such as market trends, business growth projections, and evolving workforce skill requirements. It demands a collaborative approach, primarily involving Human Resources (HR) and finance departments, to align workforce planning with an organization's financial capabilities and strategic objectives. The dynamic nature of headcount budgeting necessitates continuous monitoring and adjustment in response to economic fluctuations, business strategy shifts, technological advancements, and market dynamics. Its significance in talent management is also highlighted, aligning financial planning with talent acquisition and retention strategies to ensure a competitive edge in the market. The consequences of incorrect headcount budgeting are explored, showing how it can lead to financial strain, operational inefficiencies, and hindered strategic objectives. Examining case studies like IBM's strategic workforce rebalancing and Microsoft's shift for long-term success, the importance of aligning headcount budgeting with organizational goals is underscored. These examples illustrate that effective headcount budgeting transcends its role as a financial tool, emerging as a strategic element crucial for an organization's success. This necessitates continuous refinement and adaptation to align with evolving business goals and market conditions, highlighting its role as a key driver in organizational success and sustainability.
Keywords: strategic planning, fiscal budget, headcount planning, resource allocation, financial management, decision-making, operational efficiency, risk management, headcount budget
Procedia PDF Downloads 50
16983 A Dynamic Neural Network Model for Accurate Detection of Masked Faces
Authors: Oladapo Tolulope Ibitoye
Abstract:
Neural networks have become prominent and are widely employed in algorithm-based machine learning systems. They are well suited to solving many day-to-day problems. Neural networks are computing systems with several interconnected nodes, and one of their numerous areas of application is object detection. This area has gained prominence due to the coronavirus disease pandemic and the post-pandemic phases: wearing a face mask in public slows the spread of the virus, according to experts. This calls for the development of a reliable and effective model for detecting face masks on people's faces during compliance checks. Existing neural network models for face mask detection are characterized by their black-box nature and large dataset requirements, challenges that have compromised their performance. The proposed model utilizes a Faster R-CNN model with an Inception V3 backbone to reduce system complexity and the dataset requirement. The model was trained and validated with very few data, and evaluation results show an overall accuracy of 96% regardless of skin tone.
Keywords: convolutional neural network, face detection, face mask, masked faces
Procedia PDF Downloads 68
16982 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate
Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe
Abstract:
This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model with a SETAR model. The study uses monthly adjusted South African exchange rate data with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. The mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecasting capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy and to distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR appears to outperform ARIMA.
Keywords: ARIMA, error metrics, model selection, SETAR
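The error metrics and the Diebold-Mariano comparison mentioned in the abstract can be computed as below; the sketch uses a toy exchange-rate series and two synthetic forecast sets, and the DM statistic uses squared-error loss with a simple variance estimate (no long-run variance correction), which is an assumption.

```python
import numpy as np
from scipy import stats

def forecast_metrics(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    err = actual - forecast
    return {"MAE": np.mean(np.abs(err)),
            "RMSE": np.sqrt(np.mean(err ** 2)),
            "MAPE": 100 * np.mean(np.abs(err / actual))}

def diebold_mariano(actual, f1, f2):
    """DM test with squared-error loss and a simple (no-lag) variance estimate."""
    d = (np.asarray(actual) - np.asarray(f1)) ** 2 - (np.asarray(actual) - np.asarray(f2)) ** 2
    dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
    return dm, 2 * (1 - stats.norm.cdf(abs(dm)))   # statistic and two-sided p-value

# Toy exchange-rate series and two competing in-sample forecasts.
rng = np.random.default_rng(3)
actual = 14 + np.cumsum(rng.normal(0, 0.05, 120))
f_arima = actual + rng.normal(0, 0.06, 120)
f_setar = actual + rng.normal(0, 0.04, 120)
print("ARIMA:", forecast_metrics(actual, f_arima))
print("SETAR:", forecast_metrics(actual, f_setar))
print("DM stat, p-value:", diebold_mariano(actual, f_arima, f_setar))
```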
Procedia PDF Downloads 244
16981 Organizational Innovativeness: Motivation in Employee’s Innovative Work Behaviors
Authors: P. T. Ngan
Abstract:
Purpose: The study aims to identify the motivational conditions that most strongly influence employees' innovative work behaviors by investigating the case of SATAMANKULMA/Anya Productions Ky in Kuopio, Finland. Design/methodology: The main methodology was a qualitative single case study; the analysis was conducted with an adapted thematic content analysis procedure applied to empirical material collected through interviews, observation, and document review. Findings: The paper highlights the significance of combining relevant, synergistic extrinsic and intrinsic motivators in the organizational motivation system. The findings show that intrinsic drives are essential for the initiation phases, while extrinsic drives are more important for the implementation phases of innovative work behaviors. The study also offers the IDEA motivation model (interpersonal relationships and networks, development opportunities, economic constituent, and application supports) as a tool to optimize business performance. Practical limitations/implications: The research was conducted only from the perspective of SATAMANKULMA/Anya Productions Ky, with five interviews, a few observations, and several reviewed documents; further research is required to include other stakeholders, such as customers and partner companies. The study also does not offer statistical validity of the findings; an extensive case study or a qualitative multiple case study is suggested to compare the findings and determine whether the IDEA model is relevant in other types of firms. Originality/value: Neither the innovation nor the human resource management literature provides a detailed overview of the specific motivational conditions that firms might use to stimulate the innovative work behaviors of individual employees. This paper fills that void.
Keywords: employee innovative work behaviors, extrinsic motivation, intrinsic motivation, organizational innovativeness
Procedia PDF Downloads 267
16980 Business Intelligence Dashboard Solutions for Improving Decision Making Process: A Focus on Prostate Cancer
Authors: Mona Isazad Mashinchi, Davood Roshan Sangachin, Francis J. Sullivan, Dietrich Rebholz-Schuhmann
Abstract:
Background: Decision-making processes are nowadays driven by data, data analytics, and business intelligence (BI). As a software platform, BI can provide a wide variety of capabilities such as organizational memory, information integration, insight creation, and presentation. Visualizing data through dashboards is one BI solution (applicable in a variety of areas) that helps managers in the decision-making process by exposing the most informative information at a glance. In the healthcare domain to date, dashboards are more frequently used to track performance-related metrics and less frequently used to monitor the quality parameters that relate directly to patient outcomes. Providing effective and timely care for patients and improving health outcomes depend heavily on presenting and visualizing data and information. Objective: In this research, the focus is on the presentation capabilities of BI used to design a dashboard for prostate cancer (PC) data that allows better decision-making for patients, the hospital, and the healthcare system. The aim is to present a retrospective PC dataset in a dashboard interface to give a better understanding of the data in the categories (risk factors, treatment approaches, disease control, and side effects) that matter most to patients as well as other stakeholders. By presenting outcomes in the dashboard, we address one of the major targets of the value-based healthcare (VBHC) delivery model, which is measuring value and presenting outcomes to the different actors in the healthcare industry (such as patients and doctors) for better decision-making. Method: To visualize the stored data for users, three interactive dashboards based on the PC dataset were developed (using Tableau software) to provide better views of the risk factors, treatment approaches, and side effects. Results: Many benefits derived from the interactive graphs and tables in the dashboards, which made it easy to identify patients at risk, to understand the relationship between a patient's status after treatment and their initial status before treatment, and to choose treatments with fewer side effects given the patient's status. Conclusions: Building a well-designed and informative dashboard depends on three important factors: the users, the goals, and the data types. Dashboard hierarchies, drill-downs, and graphical features can guide doctors to navigate the information more easily. The features of the interactive PC dashboard not only let doctors ask specific questions and filter the results based on key performance indicators (KPIs) such as Gleason grade and patient age and status, but may also help patients better understand different treatment outcomes, such as side effects over time, and take an active role in their treatment decisions. Currently, we are extending the results to a real-time interactive dashboard in which users (both patients and doctors) can easily explore the data by choosing their preferred attributes and make better near real-time decisions.
Keywords: business intelligence, dashboard, decision making, healthcare, prostate cancer, value-based healthcare
Procedia PDF Downloads 141
16979 The Quality of Management: A Leadership Maturity Model to Leverage Complexity
Authors: Marlene Kuhn, Franziska Schäfer, Heiner Otten
Abstract:
Today's production processes experience a constant increase in complexity, paving the way for progressive forms of leadership. In customized production, individual customer requirements drive companies to adapt their manufacturing processes constantly, while the pressure for smaller lot sizes, lower costs, and shorter lead times grows simultaneously. As production processes become more dynamic and complex, conventional quality management approaches show certain limitations. This paper gives an introduction to complexity science from a quality management perspective. By analyzing and evaluating different characteristics of complexity, the critical complexity parameters are identified and assessed. We found that the quality of leadership plays a crucial role when dealing with increasing complexity. We therefore developed a concept for qualitative leadership customized to management within complex processes, based on a maturity model. The maturity model was then applied in industry to assess the leadership quality of several shop floor managers, with positive evaluation feedback. As a result, the maturity model proved to be a sustainable approach for leveraging the rising complexity in production processes more effectively.
Keywords: maturity model, process complexity, quality of leadership, quality management
Procedia PDF Downloads 370
16978 Simulation of a Fluid Catalytic Cracking Process
Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee
Abstract:
The fluid catalytic cracking (FCC) process is one of the most important processes in the modern refining industry, and it is the focus of this paper. Because the FCC process is difficult to model well, due to its nonlinearities and the various interactions between its process variables, rigorous process modeling of the whole FCC plant is needed for control and plant-wide optimization. In this study, a process design was developed for an FCC plant that includes the riser reactor, main fractionator, and gas processing unit. The reactor model is based on a four-lump kinetic scheme. The main fractionator, gas processing unit, and other process units are designed to reproduce real plant data using a process flowsheet simulator, Aspen PLUS. The custom reactor model was integrated with the process flowsheet simulator to develop an integrated process model.
Keywords: fluid catalytic cracking, simulation, plant data, process design
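A sketch of the four-lump riser kinetics mentioned (gas oil cracking to gasoline, light gas, and coke, with gasoline overcracking) is given below as a small ODE system; the rate constants, deactivation term, and reaction orders are assumed illustrative values, and the Aspen PLUS flowsheet integration is not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate constants for the four-lump scheme:
# gas oil -> gasoline (k1), gas oil -> light gas (k2), gas oil -> coke (k3),
# gasoline -> light gas (k4), gasoline -> coke (k5).
k1, k2, k3, k4, k5 = 0.20, 0.04, 0.02, 0.010, 0.005
alpha = 0.1                                   # catalyst deactivation constant

def four_lump(t, y):
    go, gln, gas, coke = y
    phi = np.exp(-alpha * t)                  # simple time-on-stream deactivation
    r_go = (k1 + k2 + k3) * go**2 * phi       # gas oil cracking, assumed 2nd order
    r_gln = (k4 + k5) * gln * phi             # gasoline overcracking, 1st order
    return [-r_go,
            k1 * go**2 * phi - r_gln,
            k2 * go**2 * phi + k4 * gln * phi,
            k3 * go**2 * phi + k5 * gln * phi]

sol = solve_ivp(four_lump, (0.0, 5.0), [1.0, 0.0, 0.0, 0.0], max_step=0.01)
go, gln, gas, coke = sol.y[:, -1]
print(f"riser outlet (wt. frac): gas oil {go:.2f}, gasoline {gln:.2f}, "
      f"gas {gas:.2f}, coke {coke:.2f}")
```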
Procedia PDF Downloads 457
16977 Neuron Dynamics of Single-Compartment Traub Model for Hardware Implementations
Authors: J. C. Moctezuma, V. Breña-Medina, Jose Luis Nunez-Yanez, Joseph P. McGeehan
Abstract:
In this work, we perform a bifurcation analysis of a single-compartment representation of the Traub model, one of the most important conductance-based models. The analysis focuses on two principal parameters: injected current and leakage conductance. Stable and unstable solutions are explored; the Hopf bifurcation and the interpretation of firing frequency as the current varies are also examined. This study allows control over the neuron dynamics and over the neuron's response when these parameters change. Analysis of this kind is particularly important for several applications, such as tuning parameters during learning, neuron excitability tests, and measuring the bursting properties of the neuron. Finally, hardware implementation results were developed to corroborate these findings.
Keywords: Traub model, Pinsky-Rinzel model, Hopf bifurcation, single-compartment models, bifurcation analysis, neuron modeling
Procedia PDF Downloads 323
16976 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context
Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue
Abstract:
The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people's behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives into the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by the IoT devices are analysed properly. By extracting relevant conclusions and actionable insights with established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today's marketplace. As many IoT solutions are available today, the amount of data is tremendous; the challenge for companies is to understand which solutions to focus on, how to prioritise them, and how to use data to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach them to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company's competitive advantage through smart city solutions. The results provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and their intelligent use, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors to consider when seeking competitive advantage in IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.
Keywords: data analytics, smart cities, competitive advantage, internet of things
Procedia PDF Downloads 133
16975 Manage an Acute Pain Unit based on the Balanced Scorecard
Authors: Helena Costa Oliveira, Carmem Oliveira, Rita Moutinho
Abstract:
The Balanced Scorecard (BSC) is a continuous strategic monitoring model focused not only on financial issues but also on internal processes, patients/users, and learning and growth. Initially dedicated to business management, it currently serves organizations of other natures, such as hospitals. This paper presents a BSC designed for a Portuguese Acute Pain Unit (APU). The study is qualitative and based on the experience of collaborators at the APU. The management of the APU is based on four perspectives: users, internal processes, learning and growth, and financial and legal. For each perspective, strategic objectives, critical factors, lead indicators, and initiatives were identified. The strategic map of the APU outlines the sustained strategic relations among the strategic objectives. This study contributes to the development of research in health management, as it explores how organizational insufficiencies and inconsistencies in this particular case can be addressed through the identification of critical factors, in order to clearly establish core outcomes and the initiatives to set up.
Keywords: acute pain unit, balanced scorecard, hospital management, organizational performance, Portugal
Procedia PDF Downloads 148
16974 Development of Groundwater Management Model Using Groundwater Sustainability Index
Authors: S. S. Rwanga, J. M. Ndambuki, Y. Woyessa
Abstract:
The development of a groundwater management model is an important step in the exploitation and management of any groundwater aquifer, as it assists in the long-term sustainable planning of the resource. The current study was conducted in the Central Limpopo province of South Africa with the overall objective of determining how much water can be withdrawn from the aquifer without producing non-reversible impacts on the groundwater quantity, and hence of developing a model that can sustainably protect the aquifer. The development was based on the computation of a Groundwater Sustainability Index (GSI). GSI values close to unity and above indicate overexploitation; in this study, an index of 0.8 was taken to indicate overexploitation. The results showed that there is potential for higher abstraction rates compared with the current ones. The GSI approach can be used in the management of a groundwater aquifer to develop the resource sustainably, and it provides water managers and policy makers with fundamental information on where future water developments can be carried out.
Keywords: development, groundwater, groundwater sustainability index, model
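A tiny sketch of how such an index can be screened against the 0.8 threshold used in the study is shown below, assuming the GSI is taken as the ratio of abstraction to recharge; the paper's actual GSI formulation is not given in the abstract, and the recharge and abstraction figures here are hypothetical.

```python
# Hypothetical annual figures for illustration only (million cubic metres per year).
recharge_mcm = 42.0             # estimated natural groundwater recharge
current_abstraction_mcm = 21.5  # current total withdrawal from the aquifer
threshold = 0.8                 # GSI value treated as overexploitation in the study

gsi = current_abstraction_mcm / recharge_mcm   # assumed index: abstraction / recharge
max_sustainable = threshold * recharge_mcm
print(f"GSI = {gsi:.2f} ({'over-exploited' if gsi >= threshold else 'within limits'})")
print(f"Additional abstraction possible before GSI reaches {threshold}: "
      f"{max_sustainable - current_abstraction_mcm:.1f} Mm3/yr")
```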
Procedia PDF Downloads 169
16973 Residual Life Estimation Based on Multi-Phase Nonlinear Wiener Process
Authors: Hao Chen, Bo Guo, Ping Jiang
Abstract:
Residual life (RL) estimation based on a multi-phase nonlinear Wiener process is studied in this paper; such estimation is significant for complicated products with small samples. Firstly, a nonlinear Wiener model with random parameters is introduced, and a multi-phase nonlinear Wiener model is proposed to describe the degradation of products whose degradation processes are nonlinear and separated into different phases. The multi-phase RL probability density function based on the presented model is then derived approximately in closed form, and parameter estimation is performed by maximum likelihood estimation (MLE). Finally, the method is applied to estimate the RL of a high-voltage pulse capacitor. Compared with three other models using the log-likelihood function (Log-LF) and the Akaike information criterion (AIC), the results show that the proposed degradation model captures the degradation process of high-voltage pulse capacitors better and provides more reliable results.
Keywords: multi-phase nonlinear Wiener process, residual life estimation, maximum likelihood estimation, high-voltage pulse capacitor
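The sketch below simulates a two-phase nonlinear Wiener degradation path (time-transformed drift plus Brownian term) and estimates the first-passage time to a failure threshold by Monte Carlo, which corresponds to the RL measured from the current time; the closed-form RL density and the MLE procedure of the paper are not reproduced, and all drift, diffusion, and threshold values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-phase nonlinear Wiener degradation: X(t) = mu_k * Lambda(t) + sigma_k * B(Lambda(t)),
# with time transformation Lambda(t) = t**q and a phase change at t_change.
q, t_change, threshold = 1.3, 50.0, 12.0
mu = (0.020, 0.045)          # drift per phase (hypothetical)
sigma = (0.15, 0.25)         # diffusion per phase (hypothetical)

def simulate_path(t_grid):
    x, path = 0.0, []
    lam = t_grid ** q
    for i in range(1, len(t_grid)):
        k = 0 if t_grid[i] <= t_change else 1
        dlam = lam[i] - lam[i - 1]
        x += mu[k] * dlam + sigma[k] * np.sqrt(dlam) * rng.normal()
        path.append(x)
    return np.array(path)

# Monte Carlo: time until simulated paths first cross the failure threshold.
t_grid = np.arange(0.0, 200.0, 0.5)
rl_samples = []
for _ in range(2000):
    path = simulate_path(t_grid)
    hit = np.argmax(path >= threshold)
    if path[hit] >= threshold:
        rl_samples.append(t_grid[1:][hit])
rl = np.array(rl_samples)
print(f"mean first-passage time: {rl.mean():.1f}, 10th-90th percentile: "
      f"{np.percentile(rl, 10):.1f}-{np.percentile(rl, 90):.1f}")
```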
Procedia PDF Downloads 453
16972 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments
Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing
Abstract:
Simulation modeling can be used to solve real-world problems and provides an understanding of complex systems. To develop a simplified model of a process simulation, a suitable experimental design is required to capture the surface characteristics. This paper presents the experimental design and algorithm used to model a process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin hypercube sampling (LHS) was proposed in combination with existing central composite design (CCD) samples to improve the performance of CCD in generating a second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples helps capture surface curvature characteristics. A suitable number of LHS sample points should be considered in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
Keywords: central composite design, CO2 liquefaction, latin hypercube sampling, simulation-based optimization
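The sketch below shows the basic idea: a two-factor face-centred CCD augmented with Latin hypercube points over the same coded region, followed by a least-squares fit of a second-order model. The toy response function stands in for the CO2 liquefaction simulation, and the CCD variant and sample counts are assumptions.

```python
import numpy as np
from scipy.stats import qmc

# Two-factor face-centred CCD in coded units: factorial, axial, and centre points.
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float)
axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]], float)
centre = np.zeros((3, 2))
ccd = np.vstack([factorial, axial, centre])

# Augment with Latin hypercube samples over the same coded region [-1, 1]^2.
lhs = qmc.LatinHypercube(d=2, seed=0).random(n=8) * 2.0 - 1.0
X = np.vstack([ccd, lhs])

# Toy "simulation" response standing in for the process simulator output.
def simulate(x1, x2):
    return 5.0 + 1.2 * x1 - 0.8 * x2 + 0.9 * x1 * x2 - 1.5 * x1**2 + 0.4 * x2**2

y = simulate(X[:, 0], X[:, 1])

# Second-order model: y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2.
design = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                          X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print("fitted second-order coefficients:", coef.round(3))
```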
Procedia PDF Downloads 166