Search results for: regional panel data model
36354 The Quality of Human Capital as a Factor of Social and Economic Development of the Region
Authors: O. Gubnitsyna, O. Zakoretskaya, O. Russova
Abstract:
It is generally recognized that the main task of modern society is human development. The quality of human capital has been identified as a key driver of economic development in the region. In this article, the quality of human capital is considered as one of the main types of social and economic potential for the region's development. The phenomenon of human capital represents both material and intellectual components of human activity. It is shown that a population characterized by certain quantitative and qualitative indicators (qualification and professional structure, education or general social condition, and others) is a necessary resource for the development of the regional economy. The connection of regional goals with the quality of human capital is discussed in the article, and a number of recommendations for its improvement are given. In solving the tasks stated in the article, the authors used analytical and statistical methods of research and scientific publications of domestic and foreign scientists on this issue. The results can be used in the implementation of the concept of regional development.
Keywords: human capital, the quality of human capital, economic development, social general condition
Procedia PDF Downloads 291
36353 Human Development Strengthening against Terrorism in ASEAN East Asia and Pacific: An Econometric Analysis
Authors: Tismazammi Mustafa, Jaharudin Padli
Abstract:
The frequency of terrorism has increased over the years, resulting in loss of life, damage to property, and destruction of the environment. Terrorist incidents are not confined to one particular country but have spread and scattered across countries, causing an increase in the number of terrorism cases. Thus, this paper aims to investigate the factors of human development underlying terrorism in East Asia and Pacific countries. This study uses a panel ARDL model, which captures both the long-run and short-run relationships among the variables of interest. A logit model for binary data is also used to represent the binary dependent variable. This study focuses on several human development variables, namely GDP per capita, population, human capital, land area, and technologies. The empirical findings reveal that GDP per capita, population, human capital, land area, and technologies are positive and statistically significant in influencing terrorism. The findings of this study will serve as grounds to preserve human rights and develop public awareness, and will offer guidelines to policy makers, emergency managers, first responders, public health workers, physicians, and other researchers.
Keywords: terrorism, East Asia and Pacific, human development, econometric analysis
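The logit component described above maps a linear combination of covariates to an incident probability. A minimal sketch follows; the coefficient values and covariates are hypothetical, purely to illustrate the mechanics, not the fitted model from the study:

```python
import math

def logit_prob(x, beta):
    """Predicted probability from a logit model: P(y=1 | x) = 1 / (1 + exp(-x'beta))."""
    z = sum(b * v for b, v in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients: intercept, GDP per capita, population (standardized).
beta = [-1.0, 0.8, 0.5]
x = [1.0, 0.2, 0.1]  # leading 1.0 is the intercept term
p = logit_prob(x, beta)
```

In practice, the coefficients would be estimated by maximum likelihood on the panel described in the abstract.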
Procedia PDF Downloads 414
36352 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the non-linearity and non-Gaussianity of discrete event systems, the traditional Kalman filter, which rests on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulation.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
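One cycle of the bootstrap particle filter underlying such a method (predict, weight, resample) can be sketched as follows; the one-dimensional dynamics and noise levels here are toy assumptions for illustration, not the UAV maintenance model itself:

```python
import math
import random

def particle_filter_step(particles, transition, likelihood, obs):
    """One bootstrap particle-filter cycle: predict, weight, resample."""
    # Predict: propagate each particle through the (stochastic) system model.
    predicted = [transition(p) for p in particles]
    # Update: weight particles by how well they explain the observation.
    weights = [likelihood(obs, p) for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportional to the weights.
    return random.choices(predicted, weights=weights, k=len(particles))

# Toy 1-D example (hypothetical dynamics): the state drifts by ~1 per step.
random.seed(0)
transition = lambda x: x + 1.0 + random.gauss(0, 0.5)
likelihood = lambda y, x: math.exp(-0.5 * (y - x) ** 2)  # Gaussian observation noise
particles = [random.gauss(0, 1) for _ in range(500)]
for y in [1.0, 2.0, 3.0]:  # assimilate three observations
    particles = particle_filter_step(particles, transition, likelihood, y)
estimate = sum(particles) / len(particles)
```

After assimilating the observations, the particle mean tracks the observed state, which is the behavior the abstract reports for the simulated system.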
Procedia PDF Downloads 13
36351 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, decision making in sports, such as selecting the members who play in a game and the strategy of the game, based on the analysis of accumulated sports data has been widely attempted. In the NBA basketball league, where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data, such as ball tracking or player motion, because the situation of the game changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task considered difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether or not the lineup should be changed and whether or not a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data can be treated as time series data. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model can identify the current optimal lineup for different situations. We collected accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
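The RNN component processes the per-play score sequence one step at a time. A minimal Elman-style recurrence is sketched below; the weights, dimensions, and input features are invented for illustration, not the trained model from the study:

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One Elman RNN step: h_t = tanh(W_x x_t + W_h h_prev + b)."""
    return [math.tanh(sum(wx * x for wx, x in zip(row_x, x_t)) +
                      sum(wh * h for wh, h in zip(row_h, h_prev)) + b_i)
            for row_x, row_h, b_i in zip(w_x, w_h, b)]

# Hypothetical dimensions: 2 per-play score features -> hidden state of size 3.
w_x = [[0.1, -0.2], [0.3, 0.1], [0.0, 0.2]]
w_h = [[0.5, 0.0, 0.0], [0.0, 0.5, 0.0], [0.0, 0.0, 0.5]]
b = [0.0, 0.0, 0.0]
h = [0.0, 0.0, 0.0]
for x_t in [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]:  # toy per-play score sequence
    h = rnn_step(x_t, h, w_x, w_h, b)
```

The final hidden state `h` summarizes the play sequence and would be combined with the NN's situation features to predict the score.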
Procedia PDF Downloads 133
36350 Evaluating Radiative Feedback Mechanisms in Coastal West Africa Using Regional Climate Models
Authors: Akinnubi Rufus Temidayo
Abstract:
Coastal West Africa is highly sensitive to climate variability, driven by complex ocean-atmosphere interactions that shape temperature, precipitation, and extreme weather. Radiative feedback mechanisms—such as water vapor feedback, cloud-radiation interactions, and surface albedo—play a critical role in modulating these patterns. Yet, limited research addresses these feedbacks in climate models specific to West Africa’s coastal zones, creating challenges for accurate climate projections and adaptive planning. This study aims to evaluate the influence of radiative feedbacks on the coastal climate of West Africa by quantifying the effects of water vapor, cloud cover, and sea surface temperature (SST) on the region’s radiative balance. The study uses a regional climate model (RCM) to simulate feedbacks over a 20-year period (2005-2025) with high-resolution data from CORDEX and satellite observations. Key mechanisms investigated include (1) Water Vapor Feedback—the amplifying effect of humidity on warming, (2) Cloud-Radiation Interactions—the impact of cloud cover on radiation balance, especially during the West African Monsoon, and (3) Surface Albedo and Land-Use Changes—effects of urbanization and vegetation on the radiation budget. Preliminary results indicate that radiative feedbacks strongly influence seasonal climate variability in coastal West Africa. Water vapor feedback amplifies dry-season warming, cloud-radiation interactions moderate surface temperatures during monsoon seasons, and SST variations in the Atlantic affect the frequency and intensity of extreme rainfall events. The findings suggest that incorporating these feedbacks into climate planning can strengthen resilience to climate impacts in West African coastal communities. 
Further research should refine regional models to capture anthropogenic influences such as greenhouse gas emissions, guiding sustainable urban and resource planning to mitigate climate risks.
Keywords: West Africa, radiative feedback, climate, resilience, anthropogenic
Procedia PDF Downloads 8
36349 Grid Tied Photovoltaic Power on School Roof
Authors: Yeong-cheng Wang, Jin-Yinn Wang, Ming-Shan Lin, Jian-Li Dong
Abstract:
To universalize the adoption of sustainable energy, the R.O.C. government encourages public buildings to install PV power stations on building roofs, whereas most old buildings did not consider photovoltaic (PV) power facilities in the design phase. Several factors affect PV electricity output; temperature is the key one, and different PV technologies have different temperature coefficients. Other factors, such as PV panel azimuth, panel inclination from the horizontal plane, and row-to-row distance of the PV arrays, interact from the beginning of system design. The goal of this work is to maximize the annual energy output of a roof-mounted PV system. Tables that simplify the design work are developed; the results can be used directly in engineering project quotes.
Keywords: optimal inclination, array azimuth, annual output
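The interaction between panel inclination and row-to-row distance mentioned above follows a standard shading geometry: the pitch must cover the tilted panel's footprint plus the shadow its top edge casts at the design solar altitude. A sketch under that simplifying assumption (ignoring azimuth effects and site specifics):

```python
import math

def row_pitch(panel_len, tilt_deg, sun_alt_deg):
    """Row-to-row pitch so a row's shadow clears the next row at a given
    solar altitude (simple geometric rule; azimuth effects ignored)."""
    tilt = math.radians(tilt_deg)
    alt = math.radians(sun_alt_deg)
    height = panel_len * math.sin(tilt)         # top-edge height above the roof
    shadow = height / math.tan(alt)             # horizontal shadow length
    return panel_len * math.cos(tilt) + shadow  # panel footprint + shadow

# Hypothetical 2 m panel tilted 20 degrees, worst-case solar altitude 30 degrees.
pitch = row_pitch(panel_len=2.0, tilt_deg=20, sun_alt_deg=30)
```

Higher tilt increases per-panel yield but also the required pitch, which is exactly the trade-off the design tables in the paper resolve.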
Procedia PDF Downloads 677
36348 Conceptual Design of Panel Based Reinforced Concrete Floating Substructure for 10 MW Offshore Wind Turbine
Authors: M. Sohail Hasan, Wichuda Munbua, Chikako Fujiyama, Koichi Maekawa
Abstract:
During the past few years, offshore wind energy has become a key means of reducing carbon emissions. In most previous studies, the floaters in floating offshore wind turbines (FOWT) are made of steel. However, fatigue and corrosion are always major concerns for steel marine structures. Recently, researchers have been working on concrete floating substructures. In this paper, the conceptual design of an economical and durable pre-cast panel-based reinforced concrete floating substructure for a 10 MW offshore wind turbine is proposed. A new geometrical shape, i.e., a hexagon with hollow boxes inside, is proposed under static conditions. In designing the outer panels/side walls to resist hydrostatic forces, special consideration is given to durability by limiting the crack width to the permissible range under the service limit state. A comprehensive system is proposed for transferring the ultimate moment and shear due to strong wind at the connection between the steel tower and the concrete floating substructure. Moreover, a stable connection is also designed considering the fatigue of concrete and steel due to stress fluctuations from the mooring line. This conceptual design will be verified by subsequent dynamic analysis.
Keywords: crack width control, mooring line, reinforced concrete floater, steel tower
Procedia PDF Downloads 223
36347 Geographic Information System Application for Predicting Tourism Development in Gunungkidul Regency, Indonesia
Authors: Nindyo Cahyo Kresnanto, Muhamad Willdan, Wika Harisa Putri
Abstract:
Gunungkidul is one of the emerging tourism industry areas in Yogyakarta Province, Indonesia. This article describes how GIS can predict the development of tourism potential in Gunungkidul. The tourism sector in Gunungkidul Regency contributes 3.34% of the total gross regional domestic product and is the economic sector with the highest growth, at 18.37% in the post-Covid-19 period. This contribution leads the researchers to consider that several tourist sites need to be explored further to gradually increase regional economic development. This research starts by collecting spatial data on the tourist locations that visitors want to visit in Gunungkidul Regency, based on survey data from 571 respondents. The data are then visualized with ArcGIS software. The visualization provides an overview of the tourist destinations of interest to travellers, ranked from lowest to highest. Based on the visualization results, specific tourist locations can potentially be developed to positively influence the surrounding economy. The visualization also includes a desire-line map that shows tourist travel patterns from the tourists' origins to the destinations of interest. From the desire lines, the routes to tourist sites with a high frequency of transportation activity can be predicted. These predictions indicate which heavily burdened routes are likely to be chosen, and such routes need to be improved in capacity and quality. The goal is to provide a sense of security and comfort for tourists who drive and to positively impact the tourist sites traversed by these routes.
Keywords: tourism development, GIS and survey, transportation, potential desire line
Procedia PDF Downloads 66
36346 Determinants of Profitability in Indian Pharmaceutical Firms in the New Intellectual Property Rights Regime
Authors: Shilpi Tyagi, D. K. Nauriyal
Abstract:
This study investigates the firm-level determinants of profitability in the Indian drug and pharmaceutical industry. The study uses inflation-adjusted panel data for the period 2000-2013 and applies an OLS regression model with Driscoll-Kraay standard errors. It was found that export intensity, A&M intensity, the firm's market power, and a stronger-patent-regime dummy exercise a positive influence on profitability. The negative and statistically significant influence of R&D intensity and raw material import intensity points to the need for firms to adopt suitable investment strategies. The study suggests that firms need to pay far more attention to optimizing their operating, advertisement, and marketing expenditures and to improving their export orientation as part of a long-term strategy.
Keywords: Indian pharmaceutical industry, profits, TRIPS, performance
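The point estimates in such a pooled OLS panel regression come from the usual least-squares formula; the Driscoll-Kraay correction only changes the standard errors (and in practice requires a specialized library such as statsmodels or linearmodels). A minimal single-regressor sketch of the slope estimate, with invented toy numbers rather than the study's data:

```python
def pooled_ols_slope(y, x):
    """Pooled OLS slope for a single regressor: beta = cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Toy panel: profitability vs. export intensity, firm-years stacked (hypothetical numbers).
x = [0.1, 0.2, 0.3, 0.1, 0.4, 0.5]
y = [1.2, 1.5, 1.9, 1.1, 2.1, 2.6]
beta = pooled_ols_slope(y, x)
```

The Driscoll-Kraay covariance estimator would then be applied to the residuals to obtain standard errors robust to cross-sectional and serial correlation.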
Procedia PDF Downloads 436
36345 Sampled-Data Model Predictive Tracking Control for Mobile Robot
Authors: Wookyong Kwon, Sangmoon Lee
Abstract:
In this paper, a sampled-data model predictive tracking control method is presented for mobile robots modeled as constrained continuous-time linear parameter varying (LPV) systems. The sampled-data predictive controller is designed via a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV
Procedia PDF Downloads 309
36344 R Software for Parameter Estimation of Spatio-Temporal Model
Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan
Abstract:
In this paper, we propose an application package to estimate the parameters of spatiotemporal models based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary per location. We use the Ordinary Least Squares (OLS) method and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from a volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly: it makes calculations easier and data processing accurate and fast. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. The R program under Windows can therefore be developed further for both theoretical studies and applications.
Keywords: GSTAR Model, MAPE, OLS method, oil production, R software
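The MAPE criterion used to judge model fit is straightforward to compute. A sketch (in Python rather than the paper's R, and with hypothetical values rather than the Jatibarang series):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent.
    Assumes no actual value is zero."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Toy production series vs. GSTAR-style forecasts (hypothetical numbers).
err = mape([100.0, 200.0, 400.0], [90.0, 220.0, 400.0])
```

Lower MAPE indicates a better fit; the paper uses it to compare fitted GSTAR models across locations.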
Procedia PDF Downloads 242
36343 Impact Evaluation and Technical Efficiency in Ethiopia: Correcting for Selectivity Bias in Stochastic Frontier Analysis
Authors: Tefera Kebede Leyu
Abstract:
The purpose of this study was to estimate the impact of LIVES project participation on the technical efficiency of farm households in three regions of Ethiopia. We used household-level data gathered by IRLI between February and April 2014 for the year 2013 (collected retrospectively). Data on 1,905 sample households (754 intervention and 1,151 control) were analyzed using the STATA software package, version 14. Efforts were made to combine stochastic frontier modeling with impact evaluation methodology, using the Heckman (1979) two-stage model to deal with possible selectivity bias arising from unobservable characteristics in the stochastic frontier model. Results indicate that farmers in the two groups are not fully efficient and operate below their potential frontiers, i.e., there is potential to increase crop productivity through efficiency improvements in both groups. In addition, the empirical results revealed selection bias in both groups of farmers, confirming the justification for the use of a selection-bias-corrected stochastic frontier model. It was also found that intervention farmers achieved higher technical efficiency scores than the control group. Furthermore, the selectivity-bias-corrected model showed a different technical efficiency score for the intervention farmers, while it remained more or less the same for the control group. However, the control group shows higher dispersion, as measured by the coefficient of variation, than the intervention counterparts. Among the explanatory variables, the study found that farmer's age (a proxy for farm experience), land certification, frequency of visits to the improved seed center, farmer's education, and row planting are important contributing factors to participation decisions and hence to the technical efficiency of farmers in the study areas. We recommend that policies targeting the design of development intervention programs in the agricultural sector focus more on providing farmers with on-farm visits by extension workers, the provision of credit services, the establishment of farmers' training centers, and the adoption of modern farm technologies. Finally, we recommend further research on this methodological framework using a panel data set to test whether technical efficiency increases or decreases with the length of time that farmers participate in development programs.
Keywords: impact evaluation, efficiency analysis and selection bias, stochastic frontier model, Heckman two-step
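The core of the Heckman (1979) correction referenced above is the inverse Mills ratio: after a first-stage probit of participation, the ratio of the normal density to the normal CDF evaluated at each household's fitted index is added as a regressor in the second stage. Its computation can be sketched directly:

```python
import math

def inverse_mills(z):
    """lambda(z) = phi(z) / Phi(z): the selection-correction term added as a
    regressor in the second stage of the Heckman (1979) two-step estimator."""
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return pdf / cdf

# For a household exactly at the participation margin (probit index z = 0),
# the correction term is about 0.798.
imr = inverse_mills(0.0)
```

A significant coefficient on this term in the second stage is the evidence of selection bias that the study reports for both farmer groups.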
Procedia PDF Downloads 75
36342 Numerical Solution to Coupled Heat and Moisture Diffusion in Bio-Sourced Composite Materials
Authors: Mnasri Faiza, El Ganaoui Mohammed, Khelifa Mourad, Gabsi Slimane
Abstract:
The main objective of this paper is to describe the hygrothermal behavior of porous construction materials under a temperature gradient. The construction considered is a bi-layer structure composed of two different materials. The first is a bio-sourced panel named IBS-AKU (inertia system building); the second is the Neopor material. This system (IBS-AKU Neopor) is developed by a Belgian company (Isohabitat). The study considers a multi-layer structure of the IBS-AKU panel in one dimension. A numerical method is then proposed, using the finite element method with a mesh refined in areas of strong gradients. The evolution of the temperature field and the moisture content has been computed.
Keywords: heat transfer, moisture diffusion, porous media, composite IBS-AKU, simulation
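The paper solves the coupled problem by the finite element method; as a much simpler illustration of the same one-dimensional diffusion physics, an explicit finite-difference update can be sketched (toy wall, uncoupled temperature only, hypothetical boundary values):

```python
def diffuse_1d(temps, r, steps):
    """Explicit finite-difference update for 1-D diffusion:
    T_i <- T_i + r * (T_{i-1} - 2*T_i + T_{i+1}), boundaries held fixed.
    Stable for r = alpha * dt / dx**2 <= 0.5."""
    t = list(temps)
    for _ in range(steps):
        t = ([t[0]] +
             [t[i] + r * (t[i - 1] - 2 * t[i] + t[i + 1])
              for i in range(1, len(t) - 1)] +
             [t[-1]])
    return t

# Toy bi-layer wall: warm indoor face at 25 C, cold outdoor face at 5 C (hypothetical).
profile = diffuse_1d([25.0] + [15.0] * 8 + [5.0], r=0.4, steps=200)
```

Over time the interior profile relaxes toward the steady-state gradient between the two faces; the FEM version in the paper additionally couples this to moisture transport and refines the mesh at material interfaces.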
Procedia PDF Downloads 506
36341 Estimation of Chronic Kidney Disease Using Artificial Neural Network
Authors: Ilker Ali Ozkan
Abstract:
In this study, an artificial neural network model has been developed to estimate chronic kidney failure, which is a common disease. The patients' age, their blood and biochemical values, and various chronic diseases form the 24 inputs used for the estimation process. The input data were subjected to preprocessing because they contain both missing and nominal values. The 147 patient records obtained from preprocessing were divided into 70% training and 30% testing data. As a result of the study, the artificial neural network model with 25 neurons in the hidden layer was found to have the lowest error value. Chronic kidney failure was estimated with 99.3% accuracy using this model. The developed artificial neural network was found successful for estimating chronic kidney failure disease from clinical data.
Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis
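The forward pass of a one-hidden-layer network like the one described is simple to sketch; the weights, input features, and tiny dimensions below are invented for illustration (the study's network has 24 inputs and 25 hidden neurons):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, w_out):
    """Forward pass of a one-hidden-layer network; each weight row's last
    entry is that neuron's bias."""
    h = [sigmoid(sum(w * v for w, v in zip(row[:-1], x)) + row[-1])
         for row in w_hidden]
    return sigmoid(sum(w * v for w, v in zip(w_out[:-1], h)) + w_out[-1])

# Hypothetical 3 standardized clinical inputs, 2 hidden neurons (toy sizes).
w_hidden = [[0.4, 0.7, -0.2, 0.1], [-0.3, 0.5, 0.6, 0.0]]
w_out = [1.2, -0.8, 0.1]
risk = forward([0.5, 1.2, 0.9], w_hidden, w_out)
```

Training adjusts the weights to minimize prediction error on the 70% training split; the output can be read as an estimated disease probability.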
Procedia PDF Downloads 447
36340 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and to reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model-building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
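One simple filter-style variable selection scheme of the kind the abstract refers to is to rank process variables by the absolute correlation with the fault indicator and drop the weakest; the paper's actual scheme may differ, and the data below are invented:

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def rank_variables(X, y):
    """Rank process variables (columns of X) by |correlation| with target y,
    most informative first."""
    scores = [(abs(pearson(col, y)), j) for j, col in enumerate(X)]
    return [j for _, j in sorted(scores, reverse=True)]

# Toy data: variable 0 tracks the target, variable 1 is noise (hypothetical measurements).
X = [[1.0, 2.0, 3.0, 4.0], [0.3, -0.1, 0.2, 0.0]]
y = [1.1, 2.0, 2.9, 4.2]
order = rank_variables(X, y)
```

Variables at the tail of the ranking are candidates for exclusion from the calibration model's building steps.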
Procedia PDF Downloads 355
36339 Masked Candlestick Model: A Pre-Trained Model for Trading Prediction
Authors: Ling Qi, Matloob Khushi, Josiah Poon
Abstract:
This paper introduces a pre-trained Masked Candlestick Model (MCM) for trading time-series data. The pre-trained model is based on three core designs. First, we convert trading price data at each data point as a set of normalized elements and produce embeddings of each element. Second, we generate a masked sequence of such embedded elements as inputs for self-supervised learning. Third, we use the encoder mechanism from the transformer to train the inputs. The masked model learns the contextual relations among the sequence of embedded elements, which can aid downstream classification tasks. To evaluate the performance of the pre-trained model, we fine-tune MCM for three different downstream classification tasks to predict future price trends. The fine-tuned models achieved better accuracy rates for all three tasks than the baseline models. To better analyze the effectiveness of MCM, we test the same architecture for three currency pairs, namely EUR/GBP, AUD/USD, and EUR/JPY. The experimentation results demonstrate MCM’s effectiveness on all three currency pairs and indicate the MCM’s capability for signal extraction from trading data.
Keywords: masked language model, transformer, time series prediction, trading prediction, embedding, transfer learning, self-supervised learning
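The first two design steps (normalizing each candlestick into elements, then masking some of them for self-supervised training) can be sketched as follows; the particular normalization scheme and mask probability here are plausible assumptions for illustration, not the paper's exact formulation:

```python
import random

MASK = "<MASK>"

def candle_elements(o, h, l, c):
    """Normalize one OHLC candlestick into relative elements (body and
    wick sizes scaled by the open) -- one plausible scheme, not necessarily
    the paper's exact normalization."""
    return {"body": (c - o) / o,
            "upper": (h - max(o, c)) / o,
            "lower": (min(o, c) - l) / o}

def mask_sequence(candles, mask_prob=0.15, seed=7):
    """Replace a fraction of elements with a mask token, BERT-style, so the
    encoder can be trained to reconstruct them from context."""
    rng = random.Random(seed)
    seq = [candle_elements(*c) for c in candles]
    return [{k: (MASK if rng.random() < mask_prob else v) for k, v in e.items()}
            for e in seq]

candles = [(1.00, 1.05, 0.98, 1.03), (1.03, 1.04, 1.00, 1.01)]  # toy price bars
masked = mask_sequence(candles)
```

The transformer encoder is then trained to predict the masked elements, and the resulting representations are fine-tuned for the downstream trend-classification tasks.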
Procedia PDF Downloads 125
36338 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises
Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto
Abstract:
The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler to an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. The knowledge about data is vital for organizations to ensure that data quality requirements are met and data can be effectively utilized and sovereignly governed. As this specific knowledge has been paid little attention to so far by academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights of various industry case studies and literature research.
Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel
Procedia PDF Downloads 356
36337 The Impact of Diversification Strategy on Leverage and Accrual-Based Earnings Management
Authors: Safa Lazzem, Faouzi Jilani
Abstract:
The aim of this research is to investigate the impact of diversification strategy on the relationship between leverage and accrual-based earnings management, using panel-estimation techniques on a sample of 162 non-financial French firms indexed in CAC All-Tradable over the period 2006 to 2012. The empirical results show that increases in leverage encourage managers to engage in earnings management. Our findings show that the diversification strategy provides the context needed for this accounting practice in highly diversified firms. In addition, the results indicate that diversification moderates the relationship between leverage and accrual-based earnings management by changing the nature and the sign of this relationship.
Keywords: diversification, earnings management, leverage, panel-estimation techniques
Procedia PDF Downloads 150
36336 A DEA Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most DEA models operate in a static environment with input and output parameters given as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp DEA to DEA in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The DEA model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed DEA model is illustrated with an application to real data from 50 educational institutions.
Keywords: efficiency, DEA, fuzzy, decision making units, higher education institutions
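The triangular fuzzy numbers used as inputs and outputs are typically manipulated through alpha-cuts (membership-level intervals) or defuzzified to a single value. A minimal sketch of both operations, with a made-up fuzzy input "about 100":

```python
def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha
    (0 <= alpha <= 1); at alpha = 1 it collapses to the peak m."""
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def centroid(tfn):
    """Simple centroid defuzzification of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# Hypothetical fuzzy input 'about 100' for one DMU.
lo, hi = alpha_cut((90.0, 100.0, 115.0), alpha=0.5)
crisp = centroid((90.0, 100.0, 115.0))
```

In a fuzzy DEA, the efficiency scores are evaluated over such intervals (or defuzzified values), which is what the multi-objective formulation in the paper optimizes.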
Procedia PDF Downloads 52
36335 Overview of a Quantum Model for Decision Support in a Sensor Network
Authors: Shahram Payandeh
Abstract:
This paper presents an overview of a model which can be used as part of a decision support system when fusing information from multiple sensing environments. Data fusion has been widely studied in the past few decades, and numerous frameworks have been proposed to facilitate decision making under uncertainty. Multi-sensor data fusion technology plays an increasingly significant role in people tracking and activity recognition. This paper presents an overview of a quantum model as part of a decision-making process in the context of multi-sensor data fusion. The paper presents basic definitions and relationships associating the decision-making process with the quantum model formulation in the presence of uncertainties.
Keywords: quantum model, sensor space, sensor network, decision support
Procedia PDF Downloads 227
36334 Review of Downscaling Methods in Climate Change and Their Role in Hydrological Studies
Authors: Nishi Bhuvandas, P. V. Timbadiya, P. L. Patel, P. D. Porey
Abstract:
Recent perceived climate variability raises concerns about unprecedented hydrological phenomena and extremes. The distribution and circulation of the waters of the Earth become increasingly difficult to determine because of additional uncertainty related to anthropogenic emissions. According to the sixth Intergovernmental Panel on Climate Change (IPCC) Technical Paper, on Climate Change and Water, changes in the large-scale hydrological cycle have been related to an increase in observed temperature over several decades. Although much previous research on the effect of climate change on hydrology provides a general picture of possible global hydrological change, new tools and frameworks for modelling hydrological series with nonstationary characteristics at finer scales are required for assessing climate change impacts. Of the downscaling techniques, dynamic downscaling is usually based on Regional Climate Models (RCMs), which generate finer-resolution output based on atmospheric physics over a region using General Circulation Model (GCM) fields as boundary conditions. However, RCMs are not expected to capture the observed spatial precipitation extremes at a fine cell scale or at a basin scale. Statistical downscaling derives a statistical or empirical relationship between the variables simulated by the GCMs, called predictors, and station-scale hydrologic variables, called predictands. The main focus of the paper is the need for statistical downscaling techniques for projecting local hydrometeorological variables under climate change scenarios. The projections can then serve as inputs to various hydrologic models to obtain streamflow, evapotranspiration, soil moisture, and other hydrological variables of interest.
Keywords: climate change, downscaling, GCM, RCM
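One widely used statistical downscaling/bias-correction technique in this family is empirical quantile mapping: a model value is located within the model's distribution and replaced by the observation at the same quantile. A rough sketch with toy, hypothetical rainfall values (real implementations interpolate between quantiles and handle tails more carefully):

```python
from bisect import bisect_left

def quantile_map(value, model_sorted, obs_sorted):
    """Crude empirical quantile mapping: find the value's rank in the sorted
    model distribution and read off the observation at the same relative rank."""
    n = len(model_sorted)
    rank = min(bisect_left(model_sorted, value), n - 1)
    j = min(int(rank / n * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[j]

# Toy example: model rainfall biased low relative to station observations (hypothetical).
model = sorted([1.0, 2.0, 3.0, 4.0, 5.0])
obs = sorted([2.0, 4.0, 6.0, 8.0, 10.0])
corrected = quantile_map(3.0, model, obs)
```

The corrected series preserves the model's temporal pattern while matching the observed distribution, which is the property that makes such output usable as input to hydrologic models.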
Procedia PDF Downloads 406
36333 On Panel Data Analysis of Factors on Economic Advances in Some African Countries
Authors: Ayoola Femi J., Kayode Balogun
Abstract:
In some African Countries, increase in Gross Domestic Products (GDP) has not translated to real development as expected by common-man in his household. For decades, a lot of contests on economic growth and development has been a nagging issues. The focus of this study is to analysing the effects of economic determinants/factors on economic advances in some African Countries by employing panel data analysis. The yearly (1990-2013) data were obtained from the world economic outlook database of the International Monetary Fund (IMF), for probing the effects of these variables on growth rate in some selected African countries which include: Nigeria, Algeria, Angola, Benin, Botswana, Burundi, Cape-Verde, Cameroun, Central African Republic, Chad, Republic Of Congo, Cote di’ Voire, Egypt, Equatorial-Guinea, Ethiopia, Gabon, Ghana, Guinea Bissau, Kenya, Lesotho, Madagascar, Mali, Mauritius, Morocco, Mozambique, Niger, Rwanda, Senegal, Seychelles, Sierra Leone, South Africa, Sudan, Swaziland, Tanzania, Togo, Tunisia, and Uganda. The effects of 6 macroeconomic variables on GDP were critically examined. We used 37 Countries GDP as our dependent variable and 6 independent variables used in this study include: Total Investment (totinv), Inflation (inf), Population (popl), current account balance (cab), volume of imports of goods and services (vimgs), and volume of exports of goods and services (vexgs). The results of our analysis shows that total investment, population and volume of exports of goods and services strongly affect the economic growth. We noticed that population of these selected countries positively affect the GDP while total investment and volume of exports negatively affect GDP. On the contrary, inflation, current account balance and volume of imports of goods and services’ contribution to the GDP are insignificant. The results of our analysis shows that total investment, population and volume of exports of goods and services strongly affect the economic growth. 
The results of this study would be useful to individual African governments in developing suitable and appropriate economic policies and strategies. They will also help investors understand the economic nature and viability of Africa as a continent, as well as of its individual countries.
Keywords: African countries, economic growth and development, gross domestic product, static panel data models
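The core estimation idea of such a study, regressing GDP on macroeconomic variables in a country-year panel while absorbing unobserved country effects, can be sketched with a fixed-effects (within) estimator. The sketch below uses synthetic data; the country count and year span mirror the abstract, but all coefficients and series are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 37, 24          # a 1990-2013 panel of 37 countries
n = n_countries * n_years

# Synthetic panel: unobserved country effect plus two regressors
country = np.repeat(np.arange(n_countries), n_years)
alpha = rng.normal(0, 2, n_countries)[country]   # country fixed effect
x1 = rng.normal(size=n) + 0.5 * alpha            # correlated with the effect
x2 = rng.normal(size=n)
y = alpha + 1.5 * x1 - 0.8 * x2 + rng.normal(0, 0.1, n)

def within_transform(v, groups):
    """Demean a variable within each panel entity (the 'within' estimator)."""
    totals = np.zeros(groups.max() + 1)
    np.add.at(totals, groups, v)
    counts = np.bincount(groups)
    return v - (totals / counts)[groups]

X = np.column_stack([within_transform(x1, country),
                     within_transform(x2, country)])
yd = within_transform(y, country)
beta = np.linalg.lstsq(X, yd, rcond=None)[0]
print(beta)   # recovers the true slopes, close to [1.5, -0.8]
```

Demeaning within each country removes the fixed effect, so the slope estimates stay consistent even though x1 is correlated with the unobserved country effect.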
Procedia PDF Downloads 475
36332 Modeling of Tsunami Propagation and Impact on West Vancouver Island, Canada
Authors: S. Chowdhury, A. Corlett
Abstract:
Large tsunamis strike the British Columbia coast every few hundred years. The Cascadia Subduction Zone, which extends along the Pacific coast from Vancouver Island to Northern California, is one of the most seismically active regions in Canada. Significant earthquakes have occurred in this region, including the 1700 Cascadia earthquake with an estimated magnitude of 9.2. Based on geological records, experts have predicted that a 'great earthquake' of similar magnitude may happen in this region at any time. Such an earthquake is expected to generate a large tsunami that could impact the coastal communities of Vancouver Island. Since many of these communities are in remote locations, they are particularly vulnerable, as post-earthquake relief efforts would be hampered by damage to critical road infrastructure. To assess the coastal vulnerability of these communities, a hydrodynamic model was developed using MIKE-21 software, considering a 500-year probabilistic earthquake design criterion that includes subsidence. Bathymetry information was collected from the Canadian Hydrographic Service (CHS) and the National Oceanic and Atmospheric Administration (NOAA). An aerial survey of the communities was conducted using a Cessna 172 aircraft, and the information was converted into a topographic digital elevation map. Both sets of survey information were incorporated into the model, whose domain was about 1000 km × 1300 km. The model was calibrated against the tsunami that occurred off the west coast of Moresby Island on October 28, 2012: water levels from the model were compared with two tide gauge stations close to Vancouver Island, and the output indicated satisfactory agreement. For this study, the design water level was taken as the high water level plus the projected sea level rise to the year 2100.
Hourly wind speeds from eight directions were collected from different wind stations, and a 200-year return period wind speed was used in the model for storm events. The regional model was set for a 12-hour simulation period, which takes more than 16 hours to complete on a dual Xeon E7 CPU computer with a K80 GPU. The boundary information for the local model was generated from the regional model, and the local model was developed using a high-resolution mesh to estimate coastal flooding for the communities. This study found that many communities will be affected by a Cascadia tsunami; inundation maps were developed for the communities, and the infrastructure inside the coastal inundation areas was identified. Coastal vulnerability planning and resilient design solutions will be implemented to significantly reduce the risk.
Keywords: tsunami, coastal flooding, coastal vulnerability, earthquake, Vancouver, wave propagation
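The physics underlying tsunami propagation models of this kind rests on the shallow-water approximation, in which the wave's phase speed depends only on water depth: c = sqrt(g·h). As a back-of-the-envelope check (not part of the MIKE-21 study itself), this explains both the jet-airliner speed in the open ocean and the dramatic slowing and steepening on the continental shelf:

```python
import math

def tsunami_speed(depth_m, g=9.81):
    """Shallow-water phase speed c = sqrt(g*h), valid when wavelength >> depth."""
    return math.sqrt(g * depth_m)

for h in (4000, 200, 10):   # deep ocean, continental shelf, near shore
    c = tsunami_speed(h)
    print(f"depth {h:>5} m -> {c:6.1f} m/s ({c * 3.6:6.0f} km/h)")
```

At 4000 m depth the wave travels at roughly 200 m/s (about 713 km/h), which is why trans-Pacific warning lead times are measured in hours, not days.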
Procedia PDF Downloads 130
36331 A Statistical Approach to Classification of Agricultural Regions
Authors: Hasan Vural
Abstract:
Turkey is well placed to produce a great variety of agricultural products because of its varied geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This seven-region classification has traditionally been used for data collection and publication, especially in relation to agricultural production. Later, nine agricultural regions were considered. Recently, the governmental body responsible for data collection and dissemination (Turkish Institute of Statistics, TIS) has used 12 classes, comprising 11 sub-regions and Istanbul province. This study evaluates these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data, grouped into the 11 sub-regions, were evaluated by cluster and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it would be more meaningful to consider three main and eight sub-agricultural regions throughout the country.
Keywords: agricultural region, factorial analysis, cluster analysis
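The clustering step of such a regional classification can be sketched with a plain K-means on acreage-share vectors. The data below are invented (three crop groups, twelve toy sub-regions), not the Turkish panel; the farthest-point initialisation is one simple way to keep well-separated groups apart.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the crop-acreage panel: rows = sub-regions,
# columns = acreage shares of three crop groups (values invented)
true_centres = np.array([[0.7, 0.2, 0.1], [0.2, 0.6, 0.2], [0.1, 0.2, 0.7]])
data = np.vstack([c + rng.normal(0, 0.03, (4, 3)) for c in true_centres])

def kmeans(X, k, iters=20):
    # Farthest-point initialisation: start from X[0], then repeatedly
    # add the point farthest from all chosen centres.
    centres = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None] - np.array(centres)) ** 2).sum(-1).min(1)
        centres.append(X[np.argmax(d)])
    centres = np.array(centres)
    for _ in range(iters):                       # Lloyd iterations
        labels = ((X[:, None] - centres) ** 2).sum(-1).argmin(1)
        centres = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

labels = kmeans(data, 3)
print(labels)   # each block of four sub-regions lands in one cluster
```

On real panel data one would first average (or otherwise aggregate) the yearly acreages per sub-region, as the abstract's ten-year window suggests.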
Procedia PDF Downloads 415
36330 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran
Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh
Abstract:
Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and results from the comparison of the units under investigation to estimate their performance. The Malmquist Productivity Index (MPI) is an important index for evaluating overall productivity, since it considers technological development and technical efficiency at the same time. This article proposes a model based on a multistage MPI that accommodates limited data (Grey's theory). The model can evaluate the performance of units using limited and uncertain data in a multistage process. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which involves uncertain data, to evaluate the performance of its actors. The results showed that using Grey's system theory improves the accuracy of estimating the future performance of the units under investigation. The model can be used in any case study in which MPI is applied and data are limited or uncertain.
Keywords: Malmquist index, Grey's theory, CCR model, network data envelopment analysis, Iran electricity power chain
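Once the DEA stage has produced distance-function values, the MPI itself is a simple function of four of them. The sketch below shows the standard geometric-mean form and its decomposition into efficiency change and technical change; the numbers are invented, and this is the textbook MPI, not the paper's grey multistage variant.

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist Productivity Index from four distance-function values.
    d_a_b = distance of the period-b bundle measured against the period-a frontier."""
    mpi = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    ec = d_t1_t1 / d_t_t                                    # efficiency change (catch-up)
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))   # technical change (frontier shift)
    return mpi, ec, tc

mpi, ec, tc = malmquist(0.8, 1.1, 0.7, 0.9)
print(round(mpi, 4), round(ec, 4), round(tc, 4))   # MPI factors exactly as EC * TC
```

An MPI above 1 indicates productivity growth between the two periods; the decomposition shows how much came from catching up to the frontier versus the frontier itself shifting.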
Procedia PDF Downloads 164
36329 The Efficacy of Government Strategies to Control COVID 19: Evidence from 22 High Covid Fatality Rated Countries
Authors: Imalka Wasana Rathnayaka, Rasheda Khanam, Mohammad Mafizur Rahman
Abstract:
The COVID-19 pandemic has created unprecedented challenges to both the health and economic systems of countries around the world. This study evaluates the effectiveness of governments' decisions to mitigate the risks of COVID-19 and proposes policy directions to reduce its magnitude. The study is motivated by the ongoing coronavirus outbreaks and the comprehensive policy responses countries have taken to mitigate the spread of COVID-19 and reduce death rates; it contributes to filling the knowledge gap by examining the long-term efficacy of governments' extensive plans. The study employs a panel autoregressive distributed lag (ARDL) framework; the panels incorporate both a significant number of variables and fortnightly observations from 22 countries. The dependent variables are the fortnightly death rates and rates of spread of COVID-19; mortality and infection rates were computed from the number of deaths and the number of new cases per 10,000 people. The explanatory variables are fortnightly values of indexes chosen to investigate the efficacy of government interventions to control COVID-19: the overall government response index, stringency index, containment and health index, and economic support index, taken from the Oxford COVID-19 Government Response Tracker (OxCGRT). Following the ARDL procedure, the study employs (i) unit root tests to check stationarity, (ii) panel cointegration tests, and (iii) pooled mean group (PMG) ARDL estimation. The study shows that the COVID-19 pandemic forced immediate responses from policymakers across the world to mitigate its risks.
Of the four types of government policy interventions, (i) stringency and (ii) economic support have been most effective: facilitating stringency and financial measures has reduced both infection and fatality rates. (iii) Government responses are positively associated with deaths but negatively with infected cases; although this positive long-run relationship is somewhat unexpected, the public's breaking of governments' social distancing norms in some countries and population age demographics are possible reasons for the result. (iv) Containment and healthcare improvements reduce death rates but increase infection rates, although the effect has been smaller in absolute value; the model implies that containment health practices do not work well unless paired with tracing and individual-level quarantine. The policy implication is that containment health measures must be applied together with targeted, aggressive, and rapid containment to extensively reduce the number of people infected with COVID-19. Furthermore, the results demonstrate that economic support for income and debt relief has been key to suppressing COVID-19 infection and fatality rates.
Keywords: COVID-19, infection rate, death rate, government response, panel data
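The unit root step of the ARDL procedure can be illustrated with a no-lag Dickey-Fuller regression, Δy_t = a + ρ·y_{t-1} + e_t, where a strongly negative t-statistic on ρ suggests stationarity. This is a single-series simplification of the panel tests the study actually uses, run here on simulated data:

```python
import numpy as np

def dickey_fuller(y):
    """No-lag Dickey-Fuller regression; returns the t-statistic on rho."""
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta = np.linalg.lstsq(X, dy, rcond=None)[0]
    e = dy - X @ beta
    s2 = e @ e / (len(dy) - 2)                       # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])  # std. error of rho
    return beta[1] / se

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=500))     # unit root: t-stat near zero
ar1 = np.zeros(500)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.normal()   # stationary AR(1): t-stat strongly negative
print(dickey_fuller(walk), dickey_fuller(ar1))
```

In practice the statistic is compared against Dickey-Fuller critical values (not the usual t-table), and panel versions pool such regressions across countries.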
Procedia PDF Downloads 76
36328 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification
Authors: Samiah Alammari, Nassim Ammour
Abstract:
When a deep learning model is presented with a massive number of tasks in succession, maintaining good performance requires preserving the data of previous tasks so that the model can be retrained for each upcoming classification; otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for classifying regions of remote sensing hyperspectral images. The proposed neural network architecture encapsulates two trainable subnetworks: the first module adapts its weights by minimizing the discrimination error between the land-cover classes during new-task learning, while the second module learns to replicate the data of the previous tasks by discovering the latent data structure of the new task's dataset. Experiments on the Indian Pines hyperspectral dataset confirm the capability of the proposed method.
Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation
Procedia PDF Downloads 266
36327 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors
Authors: Freddy Munzhelele
Abstract:
Setting: empirical research has shown that the life cycle theory affects firms' financing decisions, particularly dividend pay-outs. The life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends; young firms, by contrast, prioritise investment opportunity sets and their financing, and thus pay little or no dividends. Research on firms' financing decisions has also demonstrated, among other things, the relevance of the trade-off and pecking order theories to the dynamics of capital structure: the trade-off theory holds that firms weigh the costs and benefits of debt in their capital structures, while the pecking order theory holds that firms prefer a hierarchical ordering of financing sources. The life cycle hypothesis as an explanation of financial managers' capital structure decisions is thus an interesting link, yet one that has been neglected in corporate finance research. Exploring this link empirically would immensely enhance financial decision-making alternatives, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: the aim of this study is to examine the impact of the life cycle theory on the trade-off and pecking order dynamics of the capital structure of firms listed in the retail, industrial, and mining sectors of the JSE, which are among the key contributors to GDP in the South African economy. Design and methodology: following the post-positivist research paradigm, the study is quantitative in nature and utilises secondary data from the financial statements of sampled firms for the period 2010-2022, extracted from the IRESS database.
Since the data will be in panel form, a combination of static and dynamic panel data estimators will be used to analyse them, with the overall analysis done in STATA. Value added: this study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly the trade-off and pecking order theories.
Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms
Procedia PDF Downloads 61
36326 A Development of Science Instructional Model Based on Stem Education Approach to Enhance Scientific Mind and Problem Solving Skills for Primary Students
Authors: Prasita Sooksamran, Wareerat Kaewurai
Abstract:
STEM is an integrated teaching approach promoted by the Ministry of Education in Thailand. STEM education integrates the teaching of Science, Technology, Engineering, and Mathematics, and Thai teachers have questioned how to bring it into the classroom. The main objective of this study is therefore to develop a science instructional model based on the STEM approach to enhance the scientific mind and problem-solving skills of primary students. This participatory action research followed two steps: 1) developing the model and 2) seeking expert advice on it. Development began with the collection and synthesis of information from relevant documents, related research, and other sources to create a prototype instructional model, whose validity and relevance were then examined by a panel of nine experts. The findings were as follows: 1. The developed instructional model comprised principles, an objective, content, operational procedures, and learning evaluation. There were five principles: 1) learning based on the natural curiosity of primary school children, leading to knowledge inquiry, understanding, and knowledge construction; 2) learning based on the interrelation between people and environment; 3) learning based on concrete learning experiences, exploration, and the seeking of knowledge; 4) learning based on the self-construction of knowledge, creativity, and innovation; and 5) relating findings to real life and the solving of real-life problems. The objective of the model is to enhance the scientific mind and problem-solving skills, and children are evaluated according to their achievements. Lesson content is based on science as a core subject, integrated with technology and mathematics at the grade 6 level according to the Basic Education Core Curriculum 2008 guidelines.
The operational procedures consisted of six steps: 1) curiosity, 2) collection of data, 3) collaborative planning, 4) creativity and innovation, 5) criticism, and 6) communication and service. The learning evaluation is an authentic assessment based on continuous evaluation of all the material taught. 2. The experts agreed that the science instructional model based on the STEM education approach had an excellent level of validity and relevance (mean 4.67, S.D. 0.50).
Keywords: instructional model, STEM education, scientific mind, problem solving
Procedia PDF Downloads 192
36325 Time Series Regression with Meta-Clusters
Authors: Monika Chuchro
Abstract:
This paper presents a preliminary attempt to classify time series using meta-clusters in order to improve the quality of regression models. Clustering was performed to obtain subgroups of normally distributed time series from wastewater treatment plant inflow data, which is composed of several groups differing in mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods, and the Rand index was used to measure similarity. After this simple meta-clustering, a regression model was fitted for each subgroup; the final model was the sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built from the same explanatory variables but without clustering the data. Results were compared by the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and visual comparison on a line chart. The preliminary results suggest the potential of the presented technique.
Keywords: clustering, data analysis, data mining, predictive models
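The gain from fitting one regression per subgroup can be reproduced on simulated two-regime, inflow-like data. The sketch below is not the paper's method: it uses invented data and a crude split on the mean level in place of the K-means/EM clustering, but it shows why per-cluster models beat a single pooled model when the data mix groups with different means.

```python
import numpy as np

rng = np.random.default_rng(2)

# Inflow-like series: two regimes (e.g. dry/wet) with different mean levels
x = rng.uniform(0, 1, 400)
regime = rng.integers(0, 2, 400)
y = np.where(regime == 0, 10 + 2 * x, 30 - 2 * x) + rng.normal(0, 0.5, 400)

def fit(x, y):
    """Ordinary least squares for y = b0 + b1*x."""
    A = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def mape(y, yhat):
    return np.mean(np.abs((y - yhat) / y)) * 100

# One pooled model vs. one model per cluster
b = fit(x, y)
global_pred = b[0] + b[1] * x
labels = (y > y.mean()).astype(int)     # crude stand-in for K-means on level
pred = np.empty_like(y)
for j in (0, 1):
    m = labels == j
    bj = fit(x[m], y[m])
    pred[m] = bj[0] + bj[1] * x[m]
print(mape(y, global_pred), mape(y, pred))   # per-cluster MAPE is far lower
```

The pooled model is forced to predict near the overall mean and misses both regimes badly; the per-cluster models track each regime's level, which is exactly the effect the paper exploits by summing subgroup models.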
Procedia PDF Downloads 466