Search results for: weather research and forecasting (WRF) model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37114

36184 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as selecting game lineups and strategies based on the analysis of accumulated sports data, has been widely attempted. In the NBA, where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data such as ball tracking or player motion, because the situation of the game changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition from real-time play data, a task that is considered difficult for all coaches. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether the lineup should be changed at all and whether a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, which indicates a player's contribution to the game, and this scoring data can be treated as a time series. To compare the importance of players across different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the on-court situation, to build a score prediction model. This model is capable of identifying the current optimal lineup for different situations. We collected accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
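
A minimal PyTorch sketch of the kind of hybrid architecture the abstract describes is given below; the feature dimensions, layer sizes, and the way the lineup/situation features are encoded are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    """Hybrid model: an RNN over per-play scoring sequences plus an NN over
    situation/lineup features, combined to predict the next-interval score."""
    def __init__(self, seq_feat_dim=4, situ_feat_dim=16, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(seq_feat_dim, hidden, batch_first=True)   # time-series branch
        self.situation_net = nn.Sequential(                          # on-court situation branch
            nn.Linear(situ_feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        self.head = nn.Linear(2 * hidden, 1)                         # predicted score

    def forward(self, play_seq, situation):
        # play_seq: (batch, time, seq_feat_dim); situation: (batch, situ_feat_dim)
        _, (h_n, _) = self.rnn(play_seq)
        combined = torch.cat([h_n[-1], self.situation_net(situation)], dim=1)
        return self.head(combined)

model = LineupScoreModel()
pred = model(torch.randn(8, 20, 4), torch.randn(8, 16))   # dummy batch
```

Candidate lineups (for example, the current lineup versus a Small Ball lineup) can then be compared by scoring each one's situation encoding and keeping the higher prediction.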

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 129
36183 Antecedents and Consequences of Organizational Intelligence in an R and D Organization

Authors: Akriti Srivastava, Soumi Awasthy

Abstract:

One of the disciplines that has provoked increased interest in the importance of intelligence is the management and organization development literature. Organizational intelligence (OI) is a key enabling force underlying many vital activities and processes in organizational life. Hence, the factors that lead to organizational intelligence, and the outcomes that follow from it, are important to understand alongside OI itself. The focus of this research was to uncover potential antecedents and consequences of organizational intelligence, so a non-experimental explanatory survey research design was used. A non-experimental design is one in which variables are not manipulated and samples are not randomized. The data were collected with a questionnaire from 321 scientists from different laboratories of an R & D organization, of which 304 responses were found suitable for analysis. There were 194 males (age, M = 35.03, SD = 7.63) and 110 females (age, M = 34.34, SD = 8.44). This study tested a conceptual model linking antecedent variables (leadership and organizational culture) to organizational intelligence, followed by organizational innovational capability and organizational performance. Structural equation modeling techniques were used to analyze the hypothesized model. Before that, a confirmatory factor analysis of the organizational intelligence scale was conducted, which resulted in an insignificant model; an exploratory factor analysis was then performed, which yielded six factors for the organizational intelligence scale, and this structure was used throughout the study. The final analysis revealed a relatively good fit of the data to the hypothesized model with certain modifications. Leadership and organizational culture emerged as significant antecedents of organizational intelligence, while organizational innovational capability and organizational performance emerged as its consequences. However, organizational intelligence did not predict organizational performance via organizational innovational capability, and an additional significant pathway emerged between leadership and organizational performance. The model offers a fresh and comprehensive view of organizational intelligence. Prior studies in the related literature were reviewed to offer a basic framework of organizational intelligence. The study is beneficial for organizational intelligence scholarship, given its importance in a competitive environment.

Keywords: leadership, organizational culture, organizational intelligence, organizational innovational capability

Procedia PDF Downloads 336
36182 Game of Funds: Efficiency and Policy Implications of the United Kingdom Research Excellence Framework

Authors: Boon Lee

Abstract:

Research publication is an essential output of universities because it not only promotes university recognition but also attracts government funding. The history of university research culture has been one of 'publish or perish', and universities have consistently encouraged their academics and researchers to produce research articles in reputable journals in order to maintain a level of competitiveness. In turn, United Kingdom (UK) government funding is determined by the number and quality of research publications. This paper investigates whether more government funding leads to more quality papers. To that end, the paper employs a Network DEA model to evaluate UK higher education performance over the study period. Sources of efficiency are also determined via second-stage regression analysis.
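
The abstract uses a Network DEA model; as a rough illustration of the underlying envelopment linear program only, the sketch below computes a standard single-stage, input-oriented CCR DEA efficiency score with SciPy. The network structure and second-stage regression of the paper are not reproduced, and the data are placeholders.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (m inputs, n DMUs), Y: (s outputs, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta; variables = [theta, lambda_1..n]
    A_in = np.c_[-X[:, [o]], X]                 # sum_j lam_j x_ij - theta * x_io <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]         # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# toy data: 2 inputs (funding, staff) and 1 output (publications) for 4 universities
X = np.array([[10., 12., 8., 15.], [100., 90., 70., 120.]])
Y = np.array([[50., 60., 45., 80.]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```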

Keywords: efficiency, higher education, network data envelopment analysis, universities

Procedia PDF Downloads 112
36181 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model which has the capability of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model was employed for evaluating vegetative resistance by hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will have the ability to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow and further enhance the hydrologic model's capability for accurate hydrologic studies.
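
A heavily simplified daily loop in the spirit of the EPIC plant growth routine is sketched below; the coefficients (base temperature, extinction coefficient, radiation-use efficiency) and the LAI curve are illustrative placeholders, not the calibrated values used in the WRM subroutine.

```python
import math

def daily_crop_growth(tmax, tmin, solar_rad, state, params):
    """One day of a simplified EPIC-style growth loop.
    state: dict with cumulative heat units 'hu' and biomass 'bio'.
    params: base temperature, potential heat units, radiation-use efficiency, max LAI."""
    hu = max(0.0, (tmax + tmin) / 2.0 - params["t_base"])      # daily heat units
    state["hu"] += hu
    hui = min(1.0, state["hu"] / params["phu"])                # heat unit index, 0..1
    lai = params["lai_max"] * 4.0 * hui * (1.0 - hui)          # crude bell-shaped LAI curve
    par = 0.5 * solar_rad * (1.0 - math.exp(-0.65 * lai))      # intercepted PAR (MJ/m2)
    state["bio"] += params["rue"] * par                        # biomass increment (units set by RUE)
    return state

state = {"hu": 0.0, "bio": 0.0}
params = {"t_base": 8.0, "phu": 1500.0, "rue": 30.0, "lai_max": 4.0}
for day in range(120):                                         # a 120-day season of dummy weather
    state = daily_crop_growth(tmax=30.0, tmin=20.0, solar_rad=20.0, state=state, params=params)
print(round(state["bio"], 1), "(illustrative units only)")
```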

Keywords: crop yield, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 403
36180 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network are also short-lived, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, as a conventional static way of managing the data does not work in fog networks. The proposed solution describes a protocol that works by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node using reinforcement learning, so that access to the data is determined dynamically based on the requests.
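
The abstract does not detail the reinforcement-learning formulation; the sketch below is a generic tabular Q-learning loop for a grant/deny access decision, with hypothetical states (data sensitivity level, requester trust level) and a made-up reward signal, purely to illustrate the dynamic-access idea rather than the authors' protocol.

```python
import random
from collections import defaultdict

ACTIONS = ["grant", "deny"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2
Q = defaultdict(float)                                  # Q[(state, action)]

def choose_action(state):
    if random.random() < EPSILON:                       # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])    # exploit

def reward(state, action):
    sensitivity, trust = state                          # hypothetical reward: penalise granting
    if action == "grant":                               # sensitive data to low-trust requesters
        return 1.0 if trust >= sensitivity else -2.0
    return 0.1 if sensitivity > trust else -0.5

for episode in range(5000):
    state = (random.randint(0, 2), random.randint(0, 2))        # (sensitivity, trust) levels 0-2
    action = choose_action(state)
    r = reward(state, action)
    next_state = (random.randint(0, 2), random.randint(0, 2))
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])

print(Q[((2, 0), "grant")], Q[((2, 0), "deny")])        # sensitive data, low-trust requester
```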

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 57
36179 Modelling the Dynamics and Optimal Control Strategies of Terrorism within the Southern Borno State Nigeria

Authors: Lubem Matthew Kwaghkor

Abstract:

Terrorism, which remains one of the largest threats faced by various nations and communities around the world, including Nigeria, is the calculated use of violence to create a general climate of fear in a population in order to attain particular goals that might be political, religious, or economic. Several terrorist groups are currently active in Nigeria, leading to attacks on both civilian and military targets. Among these groups, Boko Haram is the deadliest terrorist group, operating mainly in Borno State. The southern part of Borno State in North-Eastern Nigeria has been plagued by terrorism, insurgency, and conflict for several years. Understanding the dynamics of terrorism is crucial for developing effective strategies to mitigate its impact on communities and to facilitate peace-building efforts. This research aims to develop a mathematical model that captures the dynamics of terrorism within the southern part of Borno State, Nigeria, incorporating both government and local community intervention strategies as control measures for combating terrorism. A compartmental model of five nonlinear differential equations is formulated. The model analyses show that a feasible solution set of the model exists and is bounded. Stability analyses show that both the terrorism-free equilibrium and the terrorism-endemic equilibrium are asymptotically stable, making the model meaningful. Optimal control theory will be employed to identify the most effective strategy to prevent or minimize acts of terrorism. The research outcomes are expected to contribute towards enhancing security and stability in southern Borno State while providing valuable insights for policymakers, security agencies, and researchers. This is ongoing research.

Keywords: modelling, terrorism, optimal control, susceptible, non-susceptible, community intervention

Procedia PDF Downloads 13
36178 Transforming Water-Energy-Gas Industry through Smart Metering and Blockchain Technology

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

Advanced metering technologies coupled with informatics create an opportunity to form digital multi-utility service providers. These providers will be able to concurrently collect customers' medium- to high-resolution water, electricity, and gas demand data and provide user-friendly platforms to feed this information back to customers and to supply/distribution utility organisations. With the emergence of blockchain technology, a new research area has opened up that takes this multi-utility service provider concept to a much higher level. This study aims to introduce a breakthrough system architecture in which smart metering technology for water, energy, and gas (WEG) is combined with blockchain technology to provide customers with a novel real-time consumption report and a decentralized resource trading platform. A pilot study on four properties in Australia was undertaken to demonstrate this system, where the benefits for customers and utilities are undeniable.

Keywords: blockchain, digital multi-utility, end use, demand forecasting

Procedia PDF Downloads 169
36177 Numerical Modeling of the Depth-Averaged Flow over a Hill

Authors: Anna Avramenko, Heikki Haario

Abstract:

This paper reports the development and application of a 2D depth-averaged model. The main goal of this contribution is to apply the depth-averaged equations to a wind park model in which the geometry is introduced into the mathematical model through mass and momentum source terms. The depth-averaged model will be used in future work to find the optimal position of wind turbines in the wind park. The k-ε and 2D LES turbulence models were considered in this article. 2D CFD simulations for a single hill were performed to check the depth-averaged model in practice.
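
For reference, a generic form of the 2D depth-averaged (shallow-water) equations with mass and momentum source terms is shown below; the specific source terms the authors use to represent the hill and wind-park geometry are not given in the abstract.

```latex
\begin{aligned}
&\frac{\partial h}{\partial t}
 + \frac{\partial (hu)}{\partial x}
 + \frac{\partial (hv)}{\partial y} = S_m,\\
&\frac{\partial (hu)}{\partial t}
 + \frac{\partial}{\partial x}\!\left(hu^2 + \tfrac{1}{2}gh^2\right)
 + \frac{\partial (huv)}{\partial y} = S_x,\\
&\frac{\partial (hv)}{\partial t}
 + \frac{\partial (huv)}{\partial x}
 + \frac{\partial}{\partial y}\!\left(hv^2 + \tfrac{1}{2}gh^2\right) = S_y,
\end{aligned}
```

where h is the flow depth, (u, v) the depth-averaged velocity, g gravity, and S_m, S_x, S_y the mass and momentum source terms that carry the geometry information.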

Keywords: depth-averaged equations, numerical modeling, CFD, wind park model

Procedia PDF Downloads 599
36176 Scale up of Isoniazid Preventive Therapy: A Quality Management Approach in Nairobi County, Kenya

Authors: E. Omanya, E. Mueni, G. Makau, M. Kariuki

Abstract:

HIV infection is the strongest risk factor for developing TB. Isoniazid preventive therapy (IPT) for people living with HIV (PLHIV) not only reduces the individual patient's risk of developing active TB but also mitigates cross-infection. In Kenya, a six-month course of IPT was recommended through the National TB, Leprosy and Lung Disease Program to treat latent TB. In spite of this recommendation by the national government, uptake of IPT among PLHIV remained low in Kenya by the end of 2015. The USAID/Kenya and East Africa Afya Jijini project, which supports 42 TB/HIV health facilities in Nairobi County, began addressing low uptake of IPT through Quality Improvement (QI) teams set up at the facility level. Quality is characterized by WHO as one of the four main connectors between health systems building blocks and health systems outputs. Afya Jijini implements the Kenya Quality Model for Health, which involves QI teams formed at the county, sub-county, and facility levels. The teams review facility performance to identify gaps in service delivery and use QI tools to monitor and improve performance. Afya Jijini supported the formation of these teams in 42 facilities and built the teams' capacity to review data and use QI principles to identify and address performance gaps. When the QI teams began working on improving IPT uptake among PLHIV, uptake was at 31.8%. The teams first conducted a root cause analysis using cause-and-effect diagrams, which help the teams brainstorm and identify barriers to IPT uptake among PLHIV at the facility level. This is a participatory process in which program staff provide technical support to the QI teams in problem identification and problem-solving. The gaps identified were inadequate knowledge and skills on the use of IPT among health care workers, lack of awareness of IPT among patients, inadequate monitoring and evaluation tools, and poor quantification and forecasting of IPT commodities. In response, Afya Jijini trained over 300 health care workers on the administration of IPT, supported patient education, supported quantification and forecasting of IPT commodities, and provided IPT data collection tools to help facilities monitor their performance. The facility QI teams conducted monthly meetings to monitor progress on IPT implementation and took corrective action when necessary. IPT uptake improved from 31.8% to 61.2% during the second year of the Afya Jijini project and to 80.1% during the third year of the project's support. The use of QI teams and root cause analysis to identify and address service delivery gaps, together with targeted program interventions and continual performance reviews, can successfully increase the uptake of TB-related services at health facilities.

Keywords: isoniazid, quality, health care workers, people living with HIV

Procedia PDF Downloads 94
36175 Hybrid Model for Measuring the Hedge Strategy in Exchange Risk in Information Technology Industry

Authors: Yi-Hsien Wang, Fu-Ju Yang, Hwa-Rong Shen, Rui-Lin Tseng

Abstract:

Businesses are increasingly exposed to market risk as financial markets become more liberalized. Hence, companies commonly use highly leveraged derivatives to hedge this risk. Since companies choose among different hedging instruments to face a variety of exchange rate risks, we employ a Multinomial Logistic-AHP approach to analyze the impact of the various derivatives. The research first summarizes the literature on the factors affecting managers' selection of exchange rate hedging instruments, applies a multinomial logistic model, and then integrates AHP. Expert questionnaires are used to test the multi-level selection and the hedging effect of different hedging instruments, in order to calculate the weights of the hedging instruments and of the multi-level factors and to understand the gap between the empirical results and practical operation. Finally, the Multinomial Logistic-AHP model ranks the weights for analysis. The research findings can serve as a reference for investors in decision-making.
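
As a rough illustration of the two components being combined, the sketch below fits a multinomial logit over hedging-instrument choice with scikit-learn and derives AHP weights from a pairwise-comparison matrix via its principal eigenvector; the instrument labels, factor data, and comparison matrix are hypothetical, not the study's questionnaire results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Multinomial logit: probability of choosing forward / option / swap (hypothetical labels)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # e.g. exposure size, volatility, leverage (placeholders)
y = rng.integers(0, 3, size=200)         # 0=forward, 1=option, 2=swap
mnl = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(X, y)
print(mnl.predict_proba(X[:1]))          # choice probabilities for one firm

# --- AHP: weights = normalised principal eigenvector of a pairwise-comparison matrix
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])           # expert judgements (hypothetical)
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
print(w / w.sum())                        # criterion weights
```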

Keywords: exchange rate risk, derivatives, hedge, multinomial logistic-AHP

Procedia PDF Downloads 441
36174 The Role of Synthetic Data in Aerial Object Detection

Authors: Ava Dodd, Jonathan Adams

Abstract:

The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying a computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques that were used to meet the accuracy requirements. The research reveals that synthetic data represents another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.

Keywords: computer vision, machine learning, synthetic data, YOLOv4

Procedia PDF Downloads 220
36173 Flexible Mixed Model Assembly Line Design: A Strategy to Respond for Demand Uncertainty at Automotive Part Manufacturer in Indonesia

Authors: T. Yuri, M. Zagloel, Inaki M. Hakim, Tegu Bintang Nugraha

Abstract:

In an era of customer centricity, automotive parts manufacturers in Indonesia must be able to keep up with the uncertainty and fluctuation of consumer demand. A Flexible Manufacturing System (FMS) is a strategy for reacting to predicted and unpredicted changes in demand in the automotive industry. This research concerns flexible mixed-model assembly line design using Value Stream Mapping (VSM) and line balancing of a mixed-model assembly line prior to simulation. Value stream mapping is used to identify and reduce waste while finding the best positions at which to add or reduce manpower. Line balancing is conducted to optimize the production rate while increasing assembly line productivity and efficiency. The result of this research is a recommended standard work combination for specific demand scenarios which can enhance assembly line efficiency and productivity.

Keywords: automotive industry, demand uncertainty, flexible assembly system, line balancing, value stream mapping

Procedia PDF Downloads 324
36172 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast; this requires identifying the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak level of surge during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual time and peak. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we treat them as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
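
A minimal sketch of the weighting schemes described (correlation-based and spread-based weights versus a simple average), with RMSE as the comparison metric; the forecast arrays are random placeholders rather than NYHOPS data.

```python
import numpy as np

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 6, 200)) + 0.1 * rng.normal(size=200)         # "observed" surge
members = np.stack([obs + 0.3 * rng.normal(size=200) for _ in range(5)])  # 5 individual forecasts

# correlation-based weights: members that track the observations better get more weight
corr = np.array([np.corrcoef(m, obs)[0, 1] for m in members])
w_corr = np.clip(corr, 0, None)
w_corr /= w_corr.sum()

# inverse-error-spread weights: members with a smaller error standard deviation get more weight
inv_std = 1.0 / np.array([np.std(m - obs) for m in members])
w_std = inv_std / inv_std.sum()

simple_avg = members.mean(axis=0)
for name, forecast in [("simple average", simple_avg),
                       ("corr-weighted", w_corr @ members),
                       ("std-weighted", w_std @ members)]:
    print(name, round(rmse(forecast, obs), 4))
```

In practice the weights would be estimated over a training window and applied to a held-out period, as the authors do for the four hurricane events.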

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 306
36171 Eliminating Injury in the Work Place and Realizing Vision Zero Using Accident Investigation and Analysis as Method: A Case Study

Authors: Ramesh Kumar Behera, Md. Izhar Hassan

Abstract:

Accident investigation and analysis are useful for identifying deficiencies in plant, process, and management practices and for formulating preventive strategies for injury elimination. In India and other parts of the world, industrial accidents are investigated to determine their causes and to fulfil legal compliance requirements. However, the findings of investigations are seldom used appropriately to strengthen Occupational Safety and Health (OSH) along expected lines. The mineral-rich state of Odisha on the eastern coast of India, known as a hub for iron and steel industries, witnessed frequent accidents during 2005-2009. This article, based on a study of 982 fatal factory accidents that occurred in Odisha during the period 2001-2016, discusses the turnaround that resulted in a reduction of fatal accidents from 122 in 2009 to 45 in 2016. The paper examines various factors causing incidents; accident patterns in the steel and chemical sectors; and the role of climate and harsh weather conditions in accident causation. Software tools such as R, SQL, MS Excel, and Tableau were used for data analysis. It was found that the greatest number of fatalities is caused by falls from height (24%); steel industries are relatively more accident-prone; and harsh summer weather conditions increase the chance of an accident by 20%. Further, the study suggests that enforcing partial work restrictions around lunchtime during peak summer, along with screening and training of employees, reduces accidents due to falls from height. The study indicates that learning from accident investigation and analysis can be used as a method to reduce work-related accidents in the journey towards 'Vision Zero'.

Keywords: accident investigation and analysis, fatal accidents in India, fall from height, vision zero

Procedia PDF Downloads 150
36170 Major Sucking Pests of Rose and Their Seasonal Abundance in Bangladesh

Authors: Md Ruhul Amin

Abstract:

This study was conducted in the experimental field of the Department of Entomology, Bangabandhu Sheikh Mujibur Rahman Agricultural University, Gazipur, Bangladesh, from November 2017 to May 2018, with a view to understanding the seasonal abundance of the major sucking pests of rose, namely thrips, aphids, and red spider mites. The findings showed that thrips began to build up their population from the middle of January with an abundance of 1.0 leaf⁻¹, increased continuously, reached a peak (2.6 leaf⁻¹) in the middle of February, and then declined. Aphids began to build up their population from the second week of November with an abundance of 6.0 leaf⁻¹, increased continuously, reached a peak (8.4 leaf⁻¹) in the last week of December, and then declined. Mites began to build up their population from the first week of December with an abundance of 0.8 leaf⁻¹, increased continuously, reached a peak (8.2 leaf⁻¹) in the second week of March, and then declined. Thrips and mites prevailed until the last week of April, and aphids were present until the last week of May. Daily mean temperature, relative humidity, and rainfall had an insignificant negative correlation with thrips abundance and a significant negative correlation with aphid abundance. Daily mean temperature had a significant positive correlation, relative humidity an insignificant positive correlation, and rainfall an insignificant negative correlation with mite abundance. Multiple linear regression analysis showed that the weather parameters together accounted for 38.1%, 41.0%, and 8.9% of the abundance of thrips, aphids, and mites on rose, respectively, and the regression equations were not significant.
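
The correlation and multiple-regression step can be reproduced along the lines of the statsmodels sketch below; the data frame values are hypothetical stand-ins for the weekly abundance and weather records, not the study's data.

```python
import pandas as pd
import statsmodels.api as sm

# hypothetical weekly records: pest counts per leaf and daily-mean weather
df = pd.DataFrame({
    "aphid": [6.0, 6.8, 7.5, 8.4, 7.9, 6.2, 4.1, 2.0],
    "temp":  [24.1, 23.0, 21.5, 20.2, 21.0, 23.5, 26.0, 28.3],
    "rh":    [78, 80, 82, 85, 83, 79, 74, 70],
    "rain":  [0.0, 1.2, 0.0, 0.0, 2.5, 0.0, 4.0, 6.1],
})

print(df[["temp", "rh", "rain"]].corrwith(df["aphid"]))   # simple correlations with abundance

X = sm.add_constant(df[["temp", "rh", "rain"]])
model = sm.OLS(df["aphid"], X).fit()                      # multiple linear regression
print(model.rsquared)                                     # share of variation explained by weather
print(model.summary())
```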

Keywords: aphid, mite, thrips, weather factors

Procedia PDF Downloads 159
36169 Administration Model for the College of Film, Television, Multimedia and Performing Arts, Suan Sunandha Rajabhat University

Authors: Somdech Rungsrisawat

Abstract:

The objective of this research was to investigate how to develop an appropriate management and administration model for the College of Film, Television, Multimedia and Performing Arts at Suan Sunandha Rajabhat University. A combination of qualitative and quantitative data collection and analysis methods was employed. Data were collected from 8 experts, who were academic staff and entrepreneurs in film, television, multimedia, and performing arts, and from 471 students studying in the communication arts field. The findings of this research present an appropriate management and administration model for the College of Film, Television, Multimedia and Performing Arts, which depends on three factors: [i] marketing management and supporting facilities such as buildings, equipment, and accessibility of the college for students; [ii] the competency of academic staff or lecturers and supporting staff; and [iii] career opportunities after graduation.

Keywords: educational institution management, educational management, learning resources, non-formal education, Thai qualifications framework for higher education

Procedia PDF Downloads 323
36168 A Predictive Machine Learning Model of the Survival of Female-led and Co-Led Small and Medium Enterprises in the UK

Authors: Mais Khader, Xingjie Wei

Abstract:

This research sheds light on female entrepreneurs by providing new insights into the survival predictions of companies led by females in the UK. The study aims to build a predictive machine learning model of the survival of female-led and co-led small and medium enterprises (SMEs) in the UK over the period 2000-2020. The predictive model utilises a combination of financial and non-financial features related to both companies and their directors to predict SMEs' survival, and these features are studied in terms of their contribution to the resultant predictive model. Five machine learning models are used in the modelling: decision tree, AdaBoost, Naïve Bayes, logistic regression, and SVM. The AdaBoost model had the highest performance of the five, with an accuracy of 73% and an AUC of 80%. The results show that company size, management experience, financial performance, industry, region, and the percentage of females in management have high feature importance in predicting companies' survival.
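
A minimal scikit-learn sketch of the best-performing setup reported (AdaBoost, evaluated by accuracy and AUC, with feature importances); the data are synthetic placeholders standing in for the financial and director-related variables described.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# placeholder data standing in for SME features (size, management experience, financials, ...)
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]
print("accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
print("AUC:", round(roc_auc_score(y_te, proba), 3))
print("feature importances:", clf.feature_importances_.round(3))  # which inputs drive survival predictions
```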

Keywords: company survival, entrepreneurship, females, machine learning, SMEs

Procedia PDF Downloads 96
36167 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward back-propagation network. Results revealed that, in general, the GDM algorithm, with its adaptive learning capability, required less time in both the training and validation phases than the LM and Br algorithms, although learning was not always consummated; this held in all instances considered, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for the training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANNs for real-time forecasting should employ training algorithms that do not carry the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and forecast quality as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions for the overall network forecast performance.
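
For reference, the evaluation statistics quoted (coefficient of efficiency and the relative-error measures) can be computed as in the sketch below; the CE here is the Nash-Sutcliffe form commonly used for streamflow, which is assumed to be what the authors mean.

```python
import numpy as np

def coefficient_of_efficiency(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency (CE)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def mae(obs, sim):
    return float(np.mean(np.abs(np.asarray(obs, float) - np.asarray(sim, float))))

def mape(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.mean(np.abs((obs - sim) / obs)) * 100.0)

def msre(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.mean(((obs - sim) / obs) ** 2))

obs = [12.0, 15.5, 9.8, 20.1, 18.4]   # observed flows (dummy values)
sim = [11.2, 16.0, 10.5, 19.0, 18.9]  # simulated flows (dummy values)
print(coefficient_of_efficiency(obs, sim), mae(obs, sim), mape(obs, sim), msre(obs, sim))
```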

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 150
36166 UBCSAND Model Calibration for Generic Liquefaction Triggering Curves

Authors: Jui-Ching Chou

Abstract:

Numerical simulation is a popular method for evaluating the effects of soil liquefaction on a structure or the effectiveness of a mitigation plan. Many constitutive models (the UBCSAND model, PM4 model, SANISAND model, etc.) have been presented to model the liquefaction phenomenon. In general, the inputs of a constitutive model need to be calibrated against the soil's cyclic resistance before being applied in a numerical simulation. Simulation results can then be compared with results from simplified liquefaction potential assessment methods. In this article, the inputs of the UBCSAND model, a simple elastic-plastic stress-strain model, are calibrated against several popular generic liquefaction triggering curves from simplified liquefaction potential assessment methods using the FLAC program. The calibrated inputs allow engineers to perform a preliminary evaluation of an existing structure or a new design project.

Keywords: calibration, liquefaction, numerical simulation, UBCSAND Model

Procedia PDF Downloads 165
36165 Modeling and Optimization of Performance of Four Stroke Spark Ignition Injector Engine

Authors: A. A. Okafor, C. H. Achebe, J. L. Chukwuneke, C. G. Ozoegwu

Abstract:

The performance of an engine whose basic design parameters are known can be predicted with the assistance of simulation programs in less time, at lower cost, and with results close to actual values. This paper presents a comprehensive mathematical model of the performance parameters of a four-stroke spark ignition engine. The essence of this research work is to develop a mathematical model for the analysis of the engine performance parameters of a four-stroke spark ignition engine before embarking on full-scale construction. This ensures that only optimal parameters enter the design and development of the engine, and it allows the engine design and its operating alternatives to be checked and developed inexpensively and quickly, instead of using experimental methods that require costly research test beds. To achieve this, equations were derived which describe the performance parameters (sfc, thermal efficiency, mep, and A/F). The equations were used to simulate and optimize the engine performance of the model for various engine speeds. The optimal values obtained for the developed bivariate mathematical models are an sfc of 0.2833 kg/kWh, a thermal efficiency of 28.77%, and an A/F ratio of 20.75.
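
As a consistency check on the reported optima, brake thermal efficiency can be recovered from the specific fuel consumption; the short calculation below assumes a gasoline lower heating value of roughly 44 MJ/kg (an assumption, not a figure from the paper), which reproduces an efficiency close to the quoted 28.77%.

```python
# Brake thermal efficiency from specific fuel consumption:
#   eta = energy output per kWh / fuel energy per kWh = 3.6 MJ / (sfc [kg/kWh] * LHV [MJ/kg])
sfc = 0.2833          # kg/kWh, the reported optimum
lhv = 44.0            # MJ/kg, assumed lower heating value of gasoline
eta = 3.6 / (sfc * lhv)
print(f"brake thermal efficiency ≈ {eta:.1%}")   # ≈ 28.9%, consistent with the reported 28.77%
```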

Keywords: bivariate models, engine performance, injector engine, optimization, performance parameters, simulation, spark ignition

Procedia PDF Downloads 320
36164 Long- and Short-Term Impacts of COVID-19 and Gold Price on Price Volatility: A Comparative Study of MIDAS and GARCH-MIDAS Models for USA Crude Oil

Authors: Samir K. Safi

Abstract:

The purpose of this study was to compare the performance of two types of models, namely MIDAS and GARCH-MIDAS, in predicting the volatility of crude oil returns based on gold price returns and the COVID-19 pandemic. The study aimed to identify which model provides more accurate short-term and long-term predictions and which model performs better in handling the increased volatility caused by the pandemic. The findings revealed that the MIDAS model performed better in predicting short-term and long-term volatility before the pandemic, while the GARCH-MIDAS model performed significantly better in handling the increased volatility caused by the pandemic. The study highlights the importance of selecting appropriate models to handle the complexities of real-world data and shows that the choice of model can significantly impact the accuracy of predictions. The practical implications of model selection and potential methodological adjustments for future research are highlighted and discussed.
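
For readers unfamiliar with the two specifications, a generic form of each is sketched below, using exponential Almon lag weights; the exact lag lengths and regressors used in the study are not stated in the abstract.

```latex
% MIDAS regression of the low-frequency target y_t on a higher-frequency regressor x^{(m)}
y_t = \beta_0 + \beta_1 \sum_{k=0}^{K} w_k(\theta)\, x^{(m)}_{t-k/m} + \varepsilon_t,
\qquad
w_k(\theta) = \frac{\exp(\theta_1 k + \theta_2 k^2)}{\sum_{j=0}^{K}\exp(\theta_1 j + \theta_2 j^2)}

% GARCH-MIDAS: conditional variance split into a short-run GARCH component g_{i,t}
% and a long-run MIDAS component \tau_t driven by the low-frequency variables
r_{i,t} = \mu + \sqrt{\tau_t\, g_{i,t}}\,\varepsilon_{i,t},
\qquad
g_{i,t} = (1-\alpha-\beta) + \alpha \frac{(r_{i-1,t}-\mu)^2}{\tau_t} + \beta\, g_{i-1,t}
```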

Keywords: GARCH-MIDAS, MIDAS, crude oil, gold, COVID-19, volatility

Procedia PDF Downloads 62
36163 Development of an Optimised, Automated Multidimensional Model for Supply Chains

Authors: Safaa H. Sindi, Michael Roe

Abstract:

This project divides supply chain (SC) models into seven Eras, according to the evolution of the market's needs over time. The five earliest Eras describe the emergence of supply chains, while the last two Eras are yet to be created. Research objectives: the aim is to generate the two latest Eras, with their respective models, focusing on consumable goods. Era Six contains the Optimal Multidimensional Matrix (OMM), which incorporates most characteristics of the SC and allocates them into four quarters (Agile, Lean, Leagile, and Basic SC). This will help companies, especially small and medium enterprises (SMEs), plan their optimal SC route. Era Seven creates an Automated Multidimensional Model (AMM), which upgrades the matrix of Era Six by accounting for all the supply chain factors (e.g., offshoring, sourcing, risk) in an interactive system with heuristic learning that helps larger companies and industries select the best SC model for their market. Methodologies: data collection is based on a Fuzzy-Delphi study that analyses statements using fuzzy logic. The first round of the Delphi study contains statements (fuzzy rules) about the matrix of Era Six; the second round contains the feedback from the first round, and so on. Preliminary findings: both models are applicable. The matrix of Era Six reduces the complexity of choosing the best SC model for SMEs by helping them identify the best strategy among Basic SC, Lean, Agile, and Leagile SC tailored to their needs. The interactive heuristic learning in the AMM of Era Seven will help mitigate error and aid large companies in identifying and re-strategizing the best SC model and distribution system for their market and commodity, hence increasing efficiency. Potential contributions to the literature: the problematic issue facing many companies is deciding which SC model or strategy to adopt, given the many models and definitions developed over the years. This research simplifies the decision by putting most definitions into a template and most models into the matrix of Era Six. The research is original in that the division of SCs into Eras, the matrix of Era Six (OMM) with Fuzzy-Delphi, and the heuristic learning in the AMM of Era Seven provide a synergy of tools that have not been combined before in the area of SC. Additionally, the OMM of Era Six is unique as it combines most characteristics of the SC, which is an original concept in itself.

Keywords: Leagile, automation, heuristic learning, supply chain models

Procedia PDF Downloads 384
36162 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning methods have made significant strides in forecasting market movements; however, the complicated and networked nature of financial data calls for more sophisticated approaches. This study presents a method for financial market prediction that leverages the synergistic capability of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, using a comprehensive dataset spanning January 1, 2015, to December 31, 2023. This period, marked by considerable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. The algorithm integrates diverse data sources to construct a dynamic financial graph that reflects market intricacies. We collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing insight into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. The GCN component learns the relational patterns among the financial instruments represented as nodes in a comprehensive market graph. Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complicated network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation of the GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model showed superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements, and an RMSE of 1.2%, underscoring its effectiveness in limiting large prediction errors, which is vital in volatile markets. Furthermore, when assessing predictive performance on directional market movements, the model achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that depend on predicting the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework.
Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
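
A compact PyTorch / PyTorch Geometric sketch of the GCN-then-LSTM pipeline described is given below; the graph construction, feature dimensions, and toy data are placeholders, and the code illustrates the architecture rather than the authors' implementation.

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

class GCNLSTM(nn.Module):
    """Per-day GCN over the asset graph, then an LSTM over the sequence of node embeddings."""
    def __init__(self, in_feats=6, gcn_hidden=32, lstm_hidden=64):
        super().__init__()
        self.gcn1 = GCNConv(in_feats, gcn_hidden)
        self.gcn2 = GCNConv(gcn_hidden, gcn_hidden)
        self.lstm = nn.LSTM(gcn_hidden, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, 1)           # next-day return (or price change)

    def forward(self, x_seq, edge_index, target_node):
        # x_seq: (T, num_nodes, in_feats) daily node features (OHLC, volume, macro signals, ...)
        embeds = []
        for x_t in x_seq:                               # spatial step: GCN per daily snapshot
            h = torch.relu(self.gcn1(x_t, edge_index))
            h = torch.relu(self.gcn2(h, edge_index))
            embeds.append(h[target_node])               # keep the asset we want to predict
        seq = torch.stack(embeds).unsqueeze(0)          # (1, T, gcn_hidden)
        out, _ = self.lstm(seq)                         # temporal step
        return self.head(out[:, -1])

# toy example: 5 assets, fully hypothetical co-movement edges, 30 trading days
edge_index = torch.tensor([[0, 1, 1, 2, 3, 4], [1, 0, 2, 1, 4, 3]])
x_seq = torch.randn(30, 5, 6)
model = GCNLSTM()
print(model(x_seq, edge_index, target_node=0))
```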

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 58
36161 An Investigation of Customer Relationship Management of Tourism

Authors: Wanida Suwunniponth

Abstract:

This research paper aimed to develop a causal relationship model of the success factors of customer relationship management (CRM) in tourism in Thailand and to investigate the relationships among the potential factors that facilitate the success of CRM. The research was conducted with both quantitative and qualitative methods, utilizing a questionnaire and in-depth interviews. The questionnaire was used to collect data from 250 management staff in hotels located within the Bangkok area. Sampling techniques included cluster sampling according to service quality and simple random sampling. The data were analyzed using descriptive analysis and structural equation modeling (SEM). The research findings identified the factors most emphasized by respondents for the success of CRM: organization, people, information technology, and the CRM process. Moreover, customer relationship management in the tourism business in Thailand was found to be successful at a very significant level. Hypothesis testing showed that the hypotheses were accepted: the factors concerning organization, people, and information technology influenced the process and the success of customer relationship management, whereas the CRM process factor influenced its success. The findings suggest that tourism businesses in Thailand implementing customer relationship management should adopt improvements in managerial structure, build a customer-centred corporate culture, and invest in information technology and customer analysis, in order to enable a more efficient customer relationship management process that results in customer satisfaction and retention.

Keywords: customer relationship management, causal relationship model, tourism, Thailand

Procedia PDF Downloads 329
36160 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model which has the capability of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model was employed for evaluating vegetative resistance by hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will have the ability to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow and further enhance the hydrologic model's capability for accurate hydrologic studies.

Keywords: runoff, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 371
36159 The Extension of the Kano Model by the Concept of Over-Service

Authors: Lou-Hon Sun, Yu-Ming Chiu, Chen-Wei Tao, Chia-Yun Tsai

Abstract:

It is common practice for many companies to ask employees to provide heart-touching service for customers and to emphasize an attitude of 'customer first'. However, such services may not necessarily gain praise, and may actually be considered excessive, if customers do not appreciate these behaviors. In reality, many restaurant businesses try to provide as much service as possible without considering whether over-provision may lead to negative customer reception. A survey of 894 people in Britain revealed that 49 percent of respondents consider over-attentive waiters the most annoying aspect of dining out. It can be seen that merely aiming to exceed customers' expectations without actually addressing their needs only further distances the standard of service from the goal of customer satisfaction itself. Over-service is defined as 'service provided that exceeds customer expectations, or that customers simply deem redundant, resulting in negative perception'. It was found that customers' reactions and complaints concerning over-service are not as intense as those against service failures caused by the inability to meet expectations; consequently, it is more difficult for managers to become aware of the existence of over-service, and the ability to manage over-service behaviors is a significant topic for consideration. The Kano model classifies customer preferences into five categories: attractive, one-dimensional, must-be, indifferent, and reverse quality attributes. The model remains very popular among researchers exploring quality aspects and customer satisfaction. Nevertheless, several studies have indicated that Kano's model cannot fully capture the nature of service quality, and the concept of over-service can be used to restructure the model and provide a better understanding of the service quality construct. In this research, the structure of Kano's two-dimensional questionnaire will be used to classify the factors into different dimensions, and the same questions will be used in a second questionnaire to identify the over-service experienced by the respondents. The findings of these two questionnaires will be used to analyze the relationship between service quality classification and over-service behaviors. The subjects of this research are customers of fine dining chain restaurants. Three hundred questionnaires will be issued based on the stratified random sampling method. Measurement items will be derived from the DINESERV scale; the tangible dimension of the questionnaire is excluded because this research focuses on employee behaviors. Quality attributes of the Kano model are often regarded as an instrument for improving customer satisfaction. The extension of the Kano model will not only develop a better understanding of customer needs and expectations but also enhance the management of service quality.

Keywords: consumer satisfaction, DINESERV, kano model, over-service

Procedia PDF Downloads 160
36158 An Ecosystem Approach to Natural Resource Management: Case Study of the Topčiderska River, Serbia

Authors: Katarina Lazarević, Mirjana Todosijević, Tijana Vulević, Natalija Momirović, Ranka Erić

Abstract:

Due to increasing demand, climate change, and world population growth, natural resources are being exploited at a fast pace. One of the most important natural resources is soil, which is susceptible to degradation. Erosion, one form of land degradation, is also among the most pressing global environmental problems. Ecosystem services are often defined as the benefits that nature provides to humankind. Soil, as the foundation of basic ecosystem functions, provides benefits to people: erosion control, water infiltration, food, fuel, fibres, and more. This research uses the ecosystem approach as a strategy for natural resource management aimed at promoting sustainability and conservation. The research was carried out on the Topčiderska River basin (Belgrade, Serbia). The InVEST Sediment Delivery Ratio (SDR) model was used to quantify erosion intensity, with a spatially distributed output map of overland sediment generation and delivery to the stream. InVEST SDR, a spatially explicit model, uses a method based on the concept of hydrological connectivity and the (R)USLE model. This, combined with socio-economic and law and policy analysis, gives decision-makers a full set of information, helping them to successfully manage and deliver sustainable ecosystems.
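
For context, the InVEST SDR model combines a per-pixel (R)USLE soil-loss estimate with a connectivity-based sediment delivery ratio, roughly as below; the parameter symbols follow the usual InVEST documentation conventions and are not taken from the abstract.

```latex
\mathrm{usle}_i = R_i \, K_i \, LS_i \, C_i \, P_i,
\qquad
E_i = \mathrm{usle}_i \cdot SDR_i,
\qquad
SDR_i = \frac{SDR_{\max}}{1 + \exp\!\left(\dfrac{IC_0 - IC_i}{k_b}\right)}
```

where R is rainfall erosivity, K soil erodibility, LS the slope length-gradient factor, C the cover-management factor, P the support-practice factor, and IC_i the hydrological connectivity index of pixel i; E_i is the sediment exported from pixel i to the stream.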

Keywords: ecosystem services, InVEST model, soil erosion, sustainability

Procedia PDF Downloads 133
36157 Comparing Spontaneous Hydrolysis Rates of Activated Models of DNA and RNA

Authors: Mohamed S. Sasi, Adel M. Mlitan, Abdulfattah M. Alkherraz

Abstract:

This research project aims to investigate the difference in relative rates of phosphoryl transfer relevant to the biological catalysis of DNA and RNA in pH-independent reactions. Activated models of DNA and RNA for alkyl-aryl phosphate diesters (with 4-nitrophenyl as a good leaving group) have been prepared successfully to gather kinetic parameters. Eyring plots for the pH-independent hydrolysis of 1 and 2 were established at different temperatures in the range 100–160 °C. These measurements have been used to provide a better estimate of the difference in relative rates between the reactivity of DNA and RNA cleavage. The Eyring plots gave an extrapolated rate of k(H₂O) = 1 × 10⁻¹⁰ s⁻¹ for 1 (RNA model) and 2 (DNA model) at 25 °C. Comparing the reactivity of the RNA and DNA models shows that the difference in relative rates in the pH-independent reactions is surprisingly small at 25 °C. This allows us to obtain chemical insights into how biological catalysts such as enzymes may have evolved to perform their current functions.
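
The extrapolation to 25 °C rests on the linearised Eyring equation, reproduced below for reference.

```latex
k = \frac{k_B T}{h}\,\exp\!\left(-\frac{\Delta H^{\ddagger}}{RT}\right)\exp\!\left(\frac{\Delta S^{\ddagger}}{R}\right)
\quad\Longleftrightarrow\quad
\ln\!\frac{k}{T} = -\frac{\Delta H^{\ddagger}}{R}\cdot\frac{1}{T} + \ln\!\frac{k_B}{h} + \frac{\Delta S^{\ddagger}}{R}
```

A plot of ln(k/T) against 1/T over the 100–160 °C measurements yields ΔH‡ and ΔS‡, from which the rate constant at 25 °C is extrapolated.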

Keywords: DNA and RNA models, relative rates, reactivity, phosphoryl transfer

Procedia PDF Downloads 418
36156 Focusing of Technology Monitoring Activities Using Indicators

Authors: Günther Schuh, Christina König, Toni Drescher

Abstract:

One of the key factors for the competitiveness and market success of technology-driven companies is the timely provision of information about emerging technologies, changes in existing technologies, and relevant related changes in market structures and participants. Therefore, many companies conduct technology intelligence (TI) activities to ensure the early identification of appropriate technologies and other (weak) signals. One core activity of TI is technology monitoring, defined as the systematic tracking of developments within a specified topic of interest, as well as related trends, over a long period of time. Due to the very large number of dynamically changing parameters within the technological and market environment of a company, as well as their possible interdependencies, it is necessary to focus technology monitoring on specific indicators or other criteria that are able to point out technological developments and market changes. In addition to a literature review of existing approaches, which mainly propose patent-based indicators, this paper examines whether indicator systems from other fields, such as risk management or economic research, could be transferred to technology monitoring in order to enable efficient and focused technology monitoring for companies.

Keywords: technology forecasting, technology indicator, technology intelligence, technology management, technology monitoring

Procedia PDF Downloads 467
36155 Stock Market Prediction by Regression Model with Social Moods

Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome

Abstract:

This paper presents a regression model with autocorrelated errors in which the inputs are social moods obtained by analyzing the adjectives in Twitter posts using a document topic model. The regression model predicts the Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
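
A regression with autocorrelated (AR(1)) errors of the kind described can be fitted with statsmodels as sketched below; the mood variables and the target series are random placeholders rather than the Twitter-derived moods or actual DJIA data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 250                                            # trading days
moods = rng.normal(size=(n, 4))                    # placeholder daily mood scores (calm, happy, ...)
e = np.zeros(n)
for t in range(1, n):                              # AR(1) error process
    e[t] = 0.6 * e[t - 1] + rng.normal(scale=0.5)
djia = 0.2 * moods[:, 0] - 0.1 * moods[:, 2] + e   # synthetic "DJIA change" target

X = sm.add_constant(moods)
model = sm.GLSAR(djia, X, rho=1)                   # regression with AR(1)-autocorrelated errors
results = model.iterative_fit(maxiter=10)          # alternate between estimating rho and the betas
print(results.params, model.rho)
```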

Keywords: stock market prediction, social moods, regression model, DJIA

Procedia PDF Downloads 543