Search results for: forecast aggregation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 728

698 Gas Aggregation and Nanobubble Stability on Substrates Influenced by Surface Wettability: A Molecular Dynamics Study

Authors: Tsu-Hsu Yen

Abstract:

The interfacial gas adsorption presents a frequent challenge and opportunity for micro-/nano-fluidic operation. In this study, we investigate the wettability, gas accumulation, and nanobubble formation on various homogeneous surface conditions using MD simulation, including a series of 3D and quasi-2D argon-water-solid system simulations. To precisely determine the wettability of the various substrates, several indicators were calculated. Among these wettability indicators, the water PMF (potential of mean force) correlates more strongly with interfacial water molecular orientation than the depletion layer width or the droplet contact angle does. The results reveal that the aggregation of argon molecules on substrates depends not only on the level of hydrophobicity but also on the competition between gas-solid and water-solid interactions, as well as on the water molecular structure near the surface. In addition, the surface nanobubble is always observed to coexist with the gas enrichment layer. The water structure adjacent to the water-gas and water-solid interfaces also plays an important role in gas out-flux and gas aggregation, respectively. The quasi-2D simulation shows only a slight difference between the curved argon-water interface and the planar interface, which suggests that the gas-water interfacial water networks have no noticeable obstructing effect on gas out-flux.

Keywords: gas aggregation, interfacial nanobubble, molecular dynamics simulation, wettability

Procedia PDF Downloads 86
697 Investigating the Relationship of Moral Hazard and Corporate Governance with Earning Forecast Quality in the Tehran Stock Exchange

Authors: Fatemeh Rouhi, Hadi Nassiri

Abstract:

Earnings forecasts are a key element in economic decisions, but conflicts of interest in financial reporting, complexity, and the lack of direct access to information create information asymmetry between individuals within the organization and external investors and creditors. This asymmetry gives rise to adverse selection and moral hazard in investors' decisions and makes it difficult for users to assess the data directly. Here, the stewardship role of corporate governance disclosure becomes clear: it comprises the controls and procedures that keep management from acting in its own interest and steer it toward maximizing shareholder and company value. Given the importance of earnings forecasts in the capital market and the need to identify the factors that influence them, this study examines the relationship of moral hazard and corporate governance with the earnings forecast quality of companies operating in the capital market. Drawing on the theoretical basis of the research, two main hypotheses and several sub-hypotheses are presented and examined on the basis of available models using the panel-data method; conclusions are drawn at the 95% confidence level according to the significance of the model and of each independent variable. In examining the models, the Chow test was first used to determine whether the panel-data or the pooled method should be used; the Hausman test was then applied to choose between random effects and fixed effects. The findings show that because most of the moral hazard variables are positively associated with earnings forecast quality, the earnings forecast quality of companies listed on the Tehran Stock Exchange increases as moral hazard increases. Among the corporate governance variables, board independence has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant.

Keywords: corporate governance, earning forecast quality, moral hazard, financial sciences

Procedia PDF Downloads 293
696 Forecasting Solid Waste Generation in Turkey

Authors: Yeliz Ekinci, Melis Koyuncu

Abstract:

Successful planning of solid waste management systems requires successful prediction of the amount of solid waste generated in an area. Waste management planning can protect the environment and human health, hence it is tremendously important for countries. A lack of information on waste generation can cause many environmental and health problems. Turkey plans to join the European Union, and solid waste management is one of the most significant criteria that must be handled in order to become a part of this community. A solid waste management system requires a good forecast of solid waste generation. Thus, this study aims to forecast solid waste generation in Turkey. Artificial Neural Network and Linear Regression models will be used for this aim. Many models will be run, and the best one will be selected based on predetermined performance measures.
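
As a hedged illustration of the model-selection step described above, the following sketch fits an Artificial Neural Network and a Linear Regression model to synthetic data and keeps whichever has the lowest holdout RMSE; the three stand-in predictors and all data are illustrative assumptions, not the study's actual variables.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # stand-in predictors (e.g., population, GDP, year)
y = X @ np.array([1.5, 0.8, 0.3]) + rng.normal(scale=0.2, size=200)  # synthetic waste tonnage

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
candidates = {
    "linear_regression": LinearRegression(),
    "ann": MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
}
rmse = {}
for name, model in candidates.items():
    model.fit(X_tr, y_tr)
    rmse[name] = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(rmse, "-> selected:", min(rmse, key=rmse.get))  # keep the lowest-RMSE model
```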

Keywords: forecast, solid waste generation, solid waste management, Turkey

Procedia PDF Downloads 481
695 Hourly Solar Radiation Predictions for Anticipatory Control of Electrically Heated Floors: Use of Online Weather Condition Forecasts

Authors: Helene Thieblemont, Fariborz Haghighat

Abstract:

Energy storage systems play a crucial role in decreasing building energy consumption during peak periods and in expanding the use of renewable energies in buildings. To provide high building thermal performance, the energy storage system has to be properly controlled to ensure good energy performance while maintaining satisfactory thermal comfort for building occupants. In the case of passive-discharge storage, the required amount of energy must be defined in advance to avoid overheating the building. Consequently, anticipatory supervisory control strategies have been developed that forecast future energy demand and production to coordinate systems. Anticipatory supervisory control strategies rely on predictions, mainly weather forecasts. However, while forecasted hourly outdoor temperatures can be found online with high accuracy, solar radiation predictions are usually not available online. To estimate them, this paper proposes an approach based on the forecast of weather conditions. Several methods for correlating forecasted hourly weather conditions with measured hourly solar radiation are compared. Results show that using weather condition forecasts allows next-day solar radiation to be estimated with acceptable accuracy. Moreover, this technique provides hourly data that may be used in building models. As a result, this solar radiation prediction model may help to implement model-based controllers such as Model Predictive Control.
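
A minimal sketch of the correlation idea, under illustrative assumptions: encode the forecasted weather condition as a categorical feature and fit a regressor on past (hour, condition, temperature) records against measured radiation; the toy clear-sky curve and condition codes below are not the paper's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, 500)
condition = rng.integers(0, 4, 500)  # assumed codes: 0=clear, 1=partly cloudy, 2=overcast, 3=rain
temp = rng.normal(15, 8, 500)
clear_sky = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 800  # toy clear-sky curve, W/m^2
radiation = clear_sky * (1.0 - 0.25 * condition) + rng.normal(0, 20, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(np.column_stack([hours, condition, temp]), radiation)
# Next day: predict radiation for each forecasted (hour, condition, temperature) triple.
print(model.predict([[12, 0, 20.0]]))  # e.g., clear sky at noon, 20 degrees C
```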

Keywords: anticipatory control, model predictive control, solar radiation forecast, thermal storage

Procedia PDF Downloads 252
694 Review on Rainfall Prediction Using Machine Learning Techniques

Authors: Prachi Desai, Ankita Gandhi, Mitali Acharya

Abstract:

Rainfall forecasting predicts rainfall in a specified area and determines its future rainfall conditions. Rainfall is always a global issue, as it affects all major aspects of life; agriculture, fisheries, forestry, tourism, and other industries are widely affected by these conditions. Studies point to insufficient availability of water resources and an increase in water demand in the near future. A new forecast system uses a deep Convolutional Neural Network (CNN) to forecast monthly rainfall and climate changes, and the CNN has also been compared against Artificial Neural Networks (ANN). Machine learning techniques used in rainfall prediction include the ARIMA model, ANN, LR, SVM, etc. The dataset on which we experiment was gathered online over the years 1901 to 2018. Test results suggest more realistic improvements than conventional rainfall forecasts.

Keywords: ANN, CNN, supervised learning, machine learning, deep learning

Procedia PDF Downloads 163
693 Forecasting Amman Stock Market Data Using a Hybrid Method

Authors: Ahmad Awajan, Sadam Al Wadi

Abstract:

In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the forecast values are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
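
A minimal sketch of the EMD-HW pipeline, assuming the PyEMD (EMD-signal) and statsmodels packages: decompose the series into IMFs plus a residue, forecast each component with Holt-Winters exponential smoothing, and sum the component forecasts; the input series is a synthetic stand-in for the Amman data.

```python
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.holtwinters import ExponentialSmoothing

prices = np.cumsum(np.random.default_rng(0).normal(size=300)) + 100  # synthetic index levels
h = 10  # forecast horizon

emd = EMD()
emd.emd(prices)
imfs, residue = emd.get_imfs_and_residue()  # decompose into IMFs and residual

forecast = np.zeros(h)
for component in list(imfs) + [residue]:
    fit = ExponentialSmoothing(component, trend="add").fit()
    forecast += fit.forecast(h)  # aggregate the component forecasts
print(forecast)
```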

Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series

Procedia PDF Downloads 102
692 Short-Term Forecast of Wind Turbine Production with Machine Learning Methods: Direct Approach and Indirect Approach

Authors: Mamadou Dione, Eric Matzner-lober, Philippe Alexandre

Abstract:

The Energy Transition Act defined by the French State has precise implications for renewable energies, in particular for their remuneration mechanism. Until now, a purchase obligation contract permitted the sale of wind-generated electricity at a fixed rate. Tomorrow, it will be necessary to sell this electricity on the market (at variable rates) before obtaining additional compensation intended to reduce the risk. This sale on the market requires announcing in advance (about 48 hours before) the production that will be delivered to the network, and hence being able to predict this production in the short term. The fundamental problem remains the variability of the wind, accentuated by the geographical situation. The objective of the project is to provide, every day, short-term forecasts (48-hour horizon) of wind production using weather data. The predictions of the GFS model and those of the ECMWF model are used as explanatory variables. The variable to be predicted is the production of a wind farm. We follow two approaches: a direct approach that predicts wind generation directly from weather data, and an indirect approach that estimates wind from weather data and converts it into wind power via power curves. We used machine learning techniques to predict this production. The models tested are random forests, CART + Bagging, CART + Boosting, and SVM (Support Vector Machine). The application is made on a 22 MW wind farm (11 wind turbines) of the Compagnie du Vent (now Engie Green France). Our results are very conclusive compared to the literature.
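
As a hedged sketch of the direct approach only (the indirect variant would predict wind speed and pass it through the farm's power curves), the code below learns farm output straight from NWP-style features; the Weibull wind speeds and the toy 22 MW power curve are illustrative assumptions, not the GFS/ECMWF data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
speed = rng.weibull(2.0, 1000) * 8      # m/s, stand-in for NWP forecast wind speed
direction = rng.uniform(0, 360, 1000)   # degrees, stand-in for NWP wind direction
power = np.clip(speed, 0, 12) ** 3 / 12 ** 3 * 22.0 + rng.normal(0, 0.5, 1000)  # MW, toy 22 MW farm

X = np.column_stack([speed, direction])
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:800], power[:800])
print("test RMSE (MW):", np.sqrt(np.mean((model.predict(X[800:]) - power[800:]) ** 2)))
```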

Keywords: forecast aggregation, machine learning, spatio-temporal dynamics modeling, wind power forecast

Procedia PDF Downloads 194
691 Empirical Study of Correlation between the Cost Performance Index Stability and the Project Cost Forecast Accuracy in Construction Projects

Authors: Amin AminiKhafri, James M. Dawson-Edwards, Ryan M. Simpson, Simaan M. AbouRizk

Abstract:

Earned value management (EVM) has been introduced as an integrated method that combines schedule, budget, and the work breakdown structure (WBS). EVM provides various indices to demonstrate project performance, including the cost performance index (CPI). CPI is also used to forecast the final project cost at completion based on cost performance during project execution. Knowing the final project cost during execution allows corrective actions to be initiated, which can enhance project outputs. CPI, however, is not constant during the project, and calculating the final project cost using a variable index is an inaccurate and challenging task for practitioners. Since CPI is based on cumulative progress values, and because of the learning curve effect, CPI variation dampens and stabilizes as the project progresses. Although various definitions of CPI stability have been proposed in the literature, many scholars have agreed on the definition that considers a project stable if the CPI at 20% completion varies by less than 0.1 from the final CPI. While the 20% completion point is recognized as the stability point for military development projects, the stability of construction projects has not been studied. In the current study, an empirical analysis was first conducted using construction project data to determine the stability point for construction projects. Early findings demonstrate that a majority of construction projects stabilize towards completion (i.e., after the 70% completion point). To investigate the effect of CPI stability on cost forecast accuracy, the correlation between CPI stability and the accuracy of the cost-at-completion forecast was also investigated. It was determined that as projects progress closer towards completion, the variation of the CPI decreases and the accuracy of the final project cost forecast increases. Most projects were found to have 90% accuracy in the final cost forecast at the 70% completion point, which is in line with the CPI stability findings. It can be concluded that early stabilization of the project CPI results in more accurate cost-at-completion forecasts.
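
A worked sketch of the quantities discussed above, using the standard EVM formulas CPI = EV / AC and EAC = BAC / CPI, together with the stability criterion (CPI differing from the final CPI by less than 0.1); the dollar figures are illustrative assumptions.

```python
def cpi(earned_value, actual_cost):
    """Cost performance index: value earned per dollar actually spent."""
    return earned_value / actual_cost

def eac(budget_at_completion, current_cpi):
    """CPI-based estimate of the final project cost at completion."""
    return budget_at_completion / current_cpi

BAC = 10_000_000                      # total project budget, $
cpi_70 = cpi(7_000_000, 6_800_000)    # CPI observed at 70% completion
cpi_final = 1.01                      # final CPI, known only after the fact
print("EAC at 70%:", round(eac(BAC, cpi_70)))
print("stable at 70%:", abs(cpi_70 - cpi_final) < 0.1)
```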

Keywords: cost performance index, earned value management, empirical study, final project cost

Procedia PDF Downloads 133
690 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts to produce an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have drawbacks. For instance, MOS is based on multiple linear regression, and it needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast. For this, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models: we use correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles to forecast storm surge levels and compare them with several existing models from the literature. We then investigate whether developing a complex ensemble model is indeed needed; to this end, we use the simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The dataset used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
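
A minimal sketch of one of the proposed weighting schemes: weight each member forecast by its correlation with observations over a training window and compare against the simple-average benchmark by RMSE; the surge series and member errors below are synthetic stand-ins for the NYHOPS data.

```python
import numpy as np

rng = np.random.default_rng(3)
obs = np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 0.05, 200)              # observed surge
members = np.stack([obs + rng.normal(0, s, 200) for s in (0.1, 0.3, 0.6)])   # 3 member forecasts

train = slice(0, 150)  # training window for the weights
w = np.array([np.corrcoef(m[train], obs[train])[0, 1] for m in members])
w /= w.sum()           # normalize correlation weights

weighted = w @ members
simple = members.mean(axis=0)  # simple-average benchmark
rmse = lambda f: np.sqrt(np.mean((f[150:] - obs[150:]) ** 2))
print("correlation-weighted:", rmse(weighted), "simple average:", rmse(simple))
```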

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 289
689 Banking and Accounting Analysis Research Effects on the Environment

Authors: Marina Magdy Naguib Karas

Abstract:

New methods of providing banking services to the customer have been introduced, such as online banking. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions by using the Internet as a new distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is it time-consuming, but it is also a repeatable activity with a certain frequency. To solve this problem, the concept of account aggregation was added as a solution. Account consolidation in e-banking as a form of electronic banking appears to build a stronger relationship with customers. An account linking service is generally referred to as a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.

Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development

Procedia PDF Downloads 20
688 Banking and Accounting Analysis Research Effects on the Environment and Income

Authors: Gerges Samaan Henin Abdalla

Abstract:

New methods of providing banking services to the customer have been introduced, such as online banking. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions by using the Internet as a new distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is it time-consuming, but it is also a repeatable activity with a certain frequency. To solve this problem, the concept of account aggregation was added as a solution. Account consolidation in e-banking as a form of electronic banking appears to build a stronger relationship with customers. An account linking service is generally referred to as a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.

Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development

Procedia PDF Downloads 30
687 Application of Grey Theory in the Forecast of Facility Maintenance Hours for Office Building Tenants and Public Areas

Authors: Yen Chia-Ju, Cheng Ding-Ruei

Abstract:

This study took a case office building as its subject and explored responsive work-order repair requests for facilities and equipment in offices and public areas using grey theory, with the purpose of providing office building owners, executive managers, property management companies, and mechanical and electrical companies a reference for selecting and assessing forecast models. The important conclusions of this study are summarized as follows. 1. Grey relational analysis ranks the importance of repair numbers across six categories in the following order: power systems, building systems, water systems, air conditioning systems, fire systems, and manpower dispatch. In terms of maintenance importance, the order is power systems, building systems, water systems, air conditioning systems, manpower dispatch, and fire systems. 2. GM(1,N) and a regression model took maintenance hours as the dependent variable and repair number, leased area, and tenant number as independent variables, and produced single-month forecasts based on 12 data points from January to December 2011. From the verification results, the mean absolute error and average accuracy of GM(1,N) were 6.41% and 93.59%, while those of the regression model were 4.66% and 95.34%, indicating that both have highly accurate forecast capability.
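
The paper's GM(1,N) uses repair number, leased area, and tenant number as driving series; as a hedged illustration, the sketch below implements the simpler univariate GM(1,1) grey model and the mean-absolute-error/accuracy measure quoted above, on synthetic maintenance hours.

```python
import numpy as np

def gm11(x0, steps=1):
    """Fit a GM(1,1) grey model to series x0 and forecast `steps` values ahead."""
    x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # developing coefficient, grey input
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response of the AGO series
    return np.diff(x1_hat)[-steps:]                    # restore to the original series

hours = np.array([420.0, 435.0, 450.0, 470.0, 480.0, 500.0])  # synthetic monthly hours
pred = gm11(hours[:-1], steps=1)[0]
mape = abs(pred - hours[-1]) / hours[-1] * 100
print(f"forecast={pred:.1f}, MAPE={mape:.2f}%, accuracy={100 - mape:.2f}%")
```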

Keywords: grey theory, forecast model, Taipei 101, office buildings, property management, facilities, equipment

Procedia PDF Downloads 415
686 IPO Valuation and Profitability Expectations: Evidence from the Italian Exchange

Authors: Matteo Bonaventura, Giancarlo Giudici

Abstract:

This paper analyses the valuation process of companies listed on the Italian Exchange in the period 2000-2009 at their Initial Public Offering (IPO). One of the most common valuation techniques declared in IPO prospectuses to determine the offer price is the Discounted Cash Flow (DCF) method. We develop a ‘reverse engineering’ model to discover the short-term profitability implied in the offer prices. We show that there is a significant optimistic bias in the estimation of future profitability compared to the ex-post actual realization, and the mean forecast error is substantially large. Yet we show that such error also characterizes the estimations carried out by analysts evaluating non-IPO companies. The forecast error is larger the faster the company's recent growth, the higher the leverage of the IPO firm, and the more companies have issued equity on the market. IPO companies generally exhibit better operating performance before the listing with respect to comparable listed companies, while after the flotation they do not perform significantly differently in terms of return on invested capital. Pre-IPO book building activity plays a significant role in partially reducing the forecast error and revising expectations, while the market price on the first day of trading does not contain information for further reducing forecast errors.
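
A hedged sketch of the ‘reverse engineering’ idea: given the offer price, invert a one-stage DCF to recover the implied near-term cash flow, a proxy for forecast profitability. The discount rate, growth rate, and horizon are illustrative assumptions, not the parameters declared in the prospectuses.

```python
from scipy.optimize import brentq

def dcf_price(cf1, r=0.10, g=0.02, years=5):
    """Present value of `years` cash flows growing at g, plus a Gordon terminal value."""
    pv = sum(cf1 * (1 + g) ** (t - 1) / (1 + r) ** t for t in range(1, years + 1))
    terminal = cf1 * (1 + g) ** years / (r - g) / (1 + r) ** years
    return pv + terminal

offer_price_per_share = 12.0
# Solve dcf_price(cf) = offer price for the implied first-year cash flow per share.
implied_cf = brentq(lambda cf: dcf_price(cf) - offer_price_per_share, 1e-6, 100.0)
print("implied first-year cash flow per share:", round(implied_cf, 3))
```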

Keywords: initial public offerings, DCF, book building, post-IPO profitability drop

Procedia PDF Downloads 327
685 Wind Power Forecast Error Simulation Model

Authors: Josip Vasilj, Petar Sarajcev, Damir Jakus

Abstract:

One of the major difficulties introduced by wind power penetration is the inherent uncertainty in production originating from uncertain wind conditions. This uncertainty impacts many different aspects of power system operation, especially the balancing power requirements. For this reason, in power system development planning, it is necessary to evaluate the potential uncertainty in future wind power generation. For this purpose, simulation models are required that reproduce the performance of wind power forecasts. This paper presents wind power forecast error simulation models based on stochastic process simulation. The proposed models capture the most important statistical parameters recognized in wind power forecast error time series. Furthermore, two distinct models are presented based on data availability: the first model uses wind speed measurements at potential or existing wind power plant locations, while the second model uses a statistical distribution of wind speeds.
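
A minimal sketch of the stochastic-process idea: simulate forecast-error trajectories as an AR(1) process whose standard deviation and lag-1 autocorrelation stand in for the 'statistical parameters' estimated from real forecast errors (the values below are illustrative assumptions).

```python
import numpy as np

def simulate_errors(n_hours, sigma=0.12, rho=0.85, n_runs=1000, seed=0):
    """Monte Carlo simulation of normalized wind power forecast errors (AR(1))."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0, sigma * np.sqrt(1 - rho ** 2), (n_runs, n_hours))
    err = np.zeros((n_runs, n_hours))
    for t in range(1, n_hours):
        err[:, t] = rho * err[:, t - 1] + shocks[:, t]  # AR(1) recursion
    return err

errors = simulate_errors(48)  # 48-hour horizon, 1000 Monte Carlo runs
print("error std at hours 1, 24, 47:", errors.std(axis=0)[[1, 24, 47]])
```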

Keywords: wind power, uncertainty, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 457
684 Automatic Aggregation and Embedding of Microservices for Optimized Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs a unique process and is instantiated and deployed on one or more machines (we assume that different microservices are deployed on different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to the following issues: resource fragmentation due to the virtual machine boundary, and poor communication performance between microservices. Two composition techniques can be used to address resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have similar scalability behavior. Embedding addresses communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendor local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding deploys a1 and b1 on machine m1, and a2 and b2 on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies that forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving the incompatibility of runtime dependencies and the security concern, and thus greatly simplifying aggregation/embedding implementations: a microservice container is simply deployed on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs, and failure tolerance.

Keywords: aggregation, deployment, embedding, resource allocation

Procedia PDF Downloads 177
683 Forecasting Nokoué Lake Water Levels Using Long Short-Term Memory Network

Authors: Namwinwelbere Dabire, Eugene C. Ezin, Adandedji M. Firmin

Abstract:

The prediction of hydrological flows (rainfall-depth or rainfall-discharge) is becoming increasingly important in the management of hydrological risks such as floods. In this study, the Long Short-Term Memory (LSTM) network, a state-of-the-art algorithm dedicated to time series, is applied to predict the daily water level of Nokoué Lake in Benin. This paper aims to provide an effective and reliable method capable of reproducing the future daily water level of Nokoué Lake, which is influenced by a combination of two phenomena: rainfall and river flow (runoff from the Ouémé River, the Sô River, the Porto-Novo lagoon, and the Atlantic Ocean). Performance analysis based on the forecasting horizon indicates that the LSTM can predict the water level of Nokoué Lake up to a forecast horizon of t+10 days. Performance metrics such as the Root Mean Square Error (RMSE), the coefficient of determination (R²), the Nash-Sutcliffe Efficiency (NSE), and the Mean Absolute Error (MAE) agree on a forecast horizon of up to t+3 days. The values of these metrics remain stable for forecast horizons of t+1 day, t+2 days, and t+3 days. The values of R² and NSE are greater than 0.97 during the training and testing phases in the Nokoué Lake basin. Based on the evaluation indices used to assess the model's performance for the appropriate forecast horizon of the water level in the Nokoué Lake basin, a forecast horizon of t+3 days is chosen for predicting future daily water levels.
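
A hedged Keras sketch of such an LSTM forecaster: sliding windows of the last 30 daily levels predict the level 3 days ahead (the horizon retained by the paper). The window length, architecture, and synthetic series are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(4)
level = np.sin(np.linspace(0, 20, 2000)) + 0.1 * rng.normal(size=2000)  # synthetic daily levels

window, horizon = 30, 3
n = len(level) - window - horizon + 1
X = np.stack([level[i:i + window] for i in range(n)])[..., None]        # (samples, 30, 1)
y = np.array([level[i + window + horizon - 1] for i in range(n)])       # level at t+3

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:1500], y[:1500], epochs=5, batch_size=32, verbose=0)

pred = model.predict(X[1500:], verbose=0).ravel()
rmse = float(np.sqrt(np.mean((pred - y[1500:]) ** 2)))
nse = 1 - np.sum((pred - y[1500:]) ** 2) / np.sum((y[1500:] - y[1500:].mean()) ** 2)
print(f"RMSE={rmse:.3f}, NSE={nse:.3f}")  # two of the metrics used in the paper
```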

Keywords: forecasting, long short-term memory cell, recurrent artificial neural network, Nokoué lake

Procedia PDF Downloads 36
682 Load Forecast of the Peak Demand Based on Both the Peak Demand and Its Location

Authors: Qais H. Alsafasfeh

Abstract:

The aim of this paper is to provide a forecast of the peak demand for the next 15 years for electrical distribution companies. The proposed methodology provides both the peak demand and its location for the next 15 years. This paper describes the Spatial Load Forecasting model used, the information provided by the electrical distribution company in Jordan, the workflow followed, the parameters used, and the assumptions made to run the model.

Keywords: load forecast, peak demand, spatial load, electrical distribution

Procedia PDF Downloads 467
681 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users pay increasing attention to their own experience, especially in terms of communication reliability, high data rates, and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches. Carrier Aggregation (CA) is one of the newest innovations and seems able to fulfill the demands of the future spectrum; CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the upcoming International Mobile Telecommunications-Advanced (IMT-Advanced) requirements (1 Gb/s peak data rate), the CA scheme was introduced by 3GPP to sustain high data rates using widespread frequency bandwidths of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signaling techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. Also, a performance evaluation in macro-cellular scenarios through a simulation approach is presented, which shows the benefits of applying CA and low-complexity multi-band schedulers in terms of service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for a cell radius longer than 1800 m (at a PLR threshold of 2%).

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 163
680 SiamMask++: More Accurate Object Tracking through Layer Wise Aggregation in Visual Object Tracking

Authors: Hyunbin Choi, Jihyeon Noh, Changwon Lim

Abstract:

In this paper, we propose SiamMask++, an architecture that performs layer-wise aggregation and depth-wise cross-correlation, and introduce a multi-RPN module and a multi-MASK module to improve EAO (Expected Average Overlap), a representative performance evaluation metric for the Visual Object Tracking (VOT) challenge. The proposed architecture, SiamMask++, has two versions: bi_SiamMask++, which runs in real time (56 fps) on systems equipped with GPUs (Titan XP), and rf_SiamMask++, which adds mask refinement modules for EAO improvements. Tests are performed on VOT2016, VOT2018, and VOT2019, the representative datasets of visual object tracking tasks labeled with rotated bounding boxes. SiamMask++ performs better than SiamMask on all three datasets tested. On the VOT2018 dataset in particular, SiamMask++ achieves 62.6% accuracy, 26.2% robustness, and 39.8% EAO. Compared to SiamMask, this is an improvement of 4.18%, 37.17%, and 23.99%, respectively. In addition, we perform an in-depth experimental analysis of how much the introduced features and the multi-modules extracted from the backbone affect the performance of our model on the VOT task.

Keywords: visual object tracking, video, deep learning, layer wise aggregation, Siamese network

Procedia PDF Downloads 120
679 Decentralized Peak-Shaving Strategies for Integrated Domestic Batteries

Authors: Corentin Jankowiak, Aggelos Zacharopoulos, Caterina Brandoni

Abstract:

In a context of increasing stress put on the electricity network by the decarbonization of many sectors, energy storage is likely to be the key mitigating element, acting as a buffer between production and demand. In particular, the potential for storage is highest when it is connected close to the loads. Yet, low-voltage storage struggles to penetrate the market at a large scale due to the novelty and complexity of the solution, and due to the regulatory competitive advantage of fossil fuel-based technologies. Strong and reliable numerical simulations are required to show the benefits of storage located near loads and to promote its development. The present study deliberately excludes aggregated control of storage: it is assumed that the storage units operate independently of one another without exchanging information – as is currently mostly the case. A computationally light battery model is presented in detail and validated by direct comparison with a domestic battery operating in real conditions. This model is then used to develop Peak-Shaving (PS) control strategies, as this is the decentralized service from which beneficial impacts are most likely to emerge. The aggregation of flatter, peak-shaved consumption profiles is likely to lead to flatter, arbitraged profiles at higher voltage layers. Furthermore, voltage fluctuations can be expected to decrease if spikes in individual consumption are reduced. The crucial part of achieving PS lies in the charging pattern: peaks depend on the switching on and off of appliances in the dwelling by the occupants and are therefore impossible to predict accurately. A performant PS strategy must, therefore, include a smart charge-recovery algorithm that can ensure enough energy is present in the battery in case it is needed, without generating new peaks by charging the unit. Three categories of PS algorithms are introduced in detail: first, algorithms using a constant threshold or power rate for charge recovery; second, algorithms using the State of Charge (SOC) as a decision variable; and finally, algorithms using a load forecast – the impact of whose accuracy is discussed – to generate PS. A set of performance metrics was defined in order to quantitatively evaluate their operation regarding peak reduction, total energy consumption, and self-consumption of domestic photovoltaic generation. The algorithms were tested on load profiles with a 1-minute granularity over a 1-year period, and their performance was assessed against these metrics. The results show that a constant charging threshold or power rate is far from optimal: a single value is unlikely to fit the variability of a residential profile. As could be expected, forecast-based algorithms show the highest performance; however, they depend on the accuracy of the forecast. On the other hand, SOC-based algorithms also perform satisfactorily, making them a strong alternative when a reliable forecast is not available.
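
A minimal sketch contrasting two of the strategy families described above on synthetic 1-minute data: discharge above a constant import threshold, with an SOC-dependent charge-recovery rule that recharges only below a target SOC and never pushes imports above the threshold. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
demand = np.clip(rng.normal(0.5, 0.4, 1440), 0, None)  # 1-min household load over a day, kW
cap_kwh, soc, threshold, target_soc = 5.0, 0.5, 1.0, 0.6
grid = np.empty_like(demand)                            # power drawn from the grid, kW

for t, d in enumerate(demand):
    if d > threshold and soc > 0.05:                    # peak-shave: discharge above the threshold
        discharge = min(d - threshold, soc * cap_kwh * 60)  # kW sustainable for one minute
        soc -= discharge / 60 / cap_kwh
        grid[t] = d - discharge
    elif d < threshold and soc < target_soc:            # SOC-based charge recovery
        charge = min(threshold - d, 1.0)                # never create a new peak while charging
        soc = min(soc + charge / 60 / cap_kwh, 1.0)
        grid[t] = d + charge
    else:
        grid[t] = d

print("peak before:", demand.max().round(2), "kW | after:", grid.max().round(2), "kW")
```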

Keywords: decentralised control, domestic integrated batteries, electricity network performance, peak-shaving algorithm

Procedia PDF Downloads 98
678 Impact of Climate on Sugarcane Yield Over Belagavi District, Karnataka Using Statistical Model

Authors: Girish Chavadappanavar

Abstract:

The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model was developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast sugarcane yield 5 weeks and even 10 weeks in advance of the harvest within an acceptable limit of error. The performance of the model in predicting yields at the district level for sugarcane crops is quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has also been studied, and the data series was tested with the Mann-Kendall rank statistical test. The maximum and minimum temperatures were found to be significant with opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found insignificant with different trends (rainfall and evening relative humidity with increasing trends, and morning relative humidity with a decreasing trend).

Keywords: climate impact, regression analysis, yield and forecast model, sugar models

Procedia PDF Downloads 42
677 Time Series Modelling for Forecasting Wheat Production and Consumption of South Africa in Time of War

Authors: Yiseyon Hosu, Joseph Akande

Abstract:

Wheat has been one of the most important staple food grains for humans for centuries and is largely consumed in South Africa. It has a special place in the South African economy because of its significance in food security, trade, and industry. This paper models and forecasts the production and consumption of wheat in South Africa in the time of COVID-19 and the ongoing Russia-Ukraine war, using annual time series data from 1940–2021 and ARIMA models. Both the averaged forecast and the selected models' forecasts indicate a possible increase in production. The minimum and maximum growth in production are projected to be 3 million and 10 million tons, respectively. However, the models also forecast a possible depression in consumption in South Africa. Although COVID-19 and the war between Ukraine and Russia, two major producers and exporters of global wheat, are currently affecting price volatility, wheat production in South Africa is expected to increase, meet consumption demand, and provide an opportunity for increased exports relative to domestic consumption. Forecasting the production and consumption behaviour of major crops plays an important role in food and nutrition security; these findings can assist policymakers and provide them with insights into the production and pricing policy of wheat in South Africa.
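
A hedged sketch of the ARIMA workflow on annual data, assuming statsmodels: fit a small grid of (p, d, q) orders, keep the lowest-AIC model, and forecast a few years ahead. The series below is synthetic, not the 1940–2021 wheat data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
production = 1.5 + 0.02 * np.arange(82) + rng.normal(0, 0.15, 82)  # million tons, synthetic

# Grid-search small ARIMA orders by AIC, then refit the winner.
best = min(
    ((p, d, q) for p in range(3) for d in range(2) for q in range(3)),
    key=lambda order: ARIMA(production, order=order).fit().aic,
)
fit = ARIMA(production, order=best).fit()
print("selected order:", best, "| 5-year forecast:", fit.forecast(5))
```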

Keywords: ARIMA, food security, price volatility, staple food, South Africa

Procedia PDF Downloads 73
676 Mitigating the Aggregation of Human Islet Amyloid Polypeptide with Nanomaterials

Authors: Ava Faridi, Pouya Faridi, Aleksandr Kakinen, Ibrahim Javed, Thomas P. Davis, Pu Chun Ke

Abstract:

Human islet amyloid polypeptide (IAPP) is a hormone associated with glycemic control and type 2 diabetes. Biophysically, the chirality of IAPP fibrils has been little explored with respect to the aggregation and toxicity of the peptide. Biochemically, it remains unclear as to how protein expression in pancreatic beta cells may be altered by cell exposure to the peptide, and how such changes may be mitigated by nanoparticle inhibitors of IAPP aggregation. In this study, we first demonstrated the elimination of the IAPP nucleation phase and the shortening of its elongation phase by silica nanoribbons. This accelerated IAPP fibrillization translated to reduced toxicity, especially for the right-handed silica nanoribbons, as revealed by cell viability, helium ion microscopy, and zebrafish embryo survival, developmental, and behavioral assays. We then examined the proteomes of βTC6 pancreatic beta cells exposed to the three main aggregation states of monomeric, oligomeric, and amyloid fibrillar IAPP, and compared them with cellular protein expression modulated by graphene quantum dots (GQDs). A total of 29 proteins were significantly regulated by different forms of IAPP, and the majority of these proteins were nucleotide-binding proteins. A regulatory capacity of GQDs against aberrant protein expression was confirmed. These studies have demonstrated the great potential of employing nanomaterials to target mesoscopic enantioselectivity and protein expression dysregulation in pancreatic beta cells.

Keywords: graphene quantum dots, IAPP, silica nanoribbons, protein expression, toxicity

Procedia PDF Downloads 115
675 Objective-Based System Dynamics Modeling to Forecast the Number of Health Professionals in Pudong New Area of Shanghai

Authors: Jie Ji, Jing Xu, Yuehong Zhuang, Xiangqing Kang, Ying Qian, Ping Zhou, Di Xue

Abstract:

Background: In 2014, there were 28,341 health professionals in Pudong New Area of Shanghai, and the number per 1000 population was 5.199, 55.55% higher than that in 2006. However, it was consistently lower than the average number of health professionals per 1000 population in Shanghai from 2006 to 2014. Therefore, allocation planning for health professionals in Pudong New Area has become a high-priority task in order to meet future demands for health care. In this study, we constructed an objective-based system dynamics model to forecast the number of health professionals in Pudong New Area of Shanghai in 2020. Methods: We collected data from health statistics reports and a previous survey of human resources in Pudong New Area of Shanghai. Nine experts from health administrative departments, public hospitals, and community health service centers were consulted to estimate the current and future status of nine variables used in the system dynamics model. Based on the objective for the number of health professionals per 1000 population (8.0) in Shanghai for 2020, the system dynamics model for health professionals in Pudong New Area was constructed to forecast the number of health professionals needed in 2020. Results: The model forecasts that there will be 37,330 health professionals (6.433 per 1000 population) in 2020. If the success rate of health professional recruitment changed from 20% to 70%, the number of health professionals per 1000 population would change from 5.269 to 6.919. If this rate changed from 20% to 70% and the success rate of building new beds changed from 5% to 30% at the same time, the number of health professionals per 1000 population would change from 5.269 to 6.923. Conclusions: The system dynamics model can be used to simulate and forecast the number of health professionals. However, without significant changes in health policies and the management system, the number of health professionals per 1000 population will not reach the objective in Pudong New Area by 2020.
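
A minimal stock-and-flow sketch of the objective-based idea: the stock of health professionals rises through recruitment (scaled by a success rate) and falls through attrition, and the 2020 level is checked against the per-1000-population objective. The flow rates and the fixed population are illustrative assumptions, not the experts' estimates.

```python
pros = 28_341                 # 2014 stock of health professionals
population = 5_451_000        # population consistent with 5.199 per 1000 in 2014
recruit_need, success_rate, attrition = 10_000, 0.20, 0.02  # assumed annual flows

for year in range(2015, 2021):
    pros += recruit_need * success_rate - pros * attrition  # inflow minus outflow
print(f"2020: {pros:,.0f} professionals, {pros / population * 1000:.3f} per 1000 (objective 8.0)")
```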

Keywords: allocation planning, forecast, health professional, system dynamics

Procedia PDF Downloads 360
674 Preventing Neurodegenerative Diseases by Stabilization of Superoxide Dismutase by Natural Polyphenolic Compounds

Authors: Danish Idrees, Vijay Kumar, Samudrala Gourinath

Abstract:

Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease caused by misfolding and aggregation of Cu, Zn superoxide dismutase (SOD1). The use of small molecules has been shown to stabilize the SOD1 dimer, preventing its dissociation and aggregation. In this study, we employed molecular docking, molecular dynamics (MD) simulation, and surface plasmon resonance (SPR) to study the interactions between SOD1 and natural polyphenolic compounds. To explore the noncovalent interactions between SOD1 and natural polyphenolic compounds, molecular docking and MD simulations were employed to gain insights into the binding modes and free energies of SOD1-polyphenol complexes. MM/PBSA methods were used to calculate free energies from the obtained MD trajectories. The compounds hesperidin, ergosterol, and rutin showed excellent binding affinity to SOD1, in the micromolar range. Ergosterol and hesperidin had the strongest binding affinity to SOD1 and were subjected to further characterization. Biophysical experiments using circular dichroism and thioflavin T fluorescence spectroscopy show that the binding of these two compounds can stabilize the SOD1 dimer and inhibit the aggregation of SOD1. Molecular simulation results also suggest that these compounds reduce the dissociation of SOD1 dimers through direct interaction with the dimer interface. This study will be helpful for developing other drug-like molecules that may reduce the aggregation of SOD1.

Keywords: amyotrophic lateral sclerosis, molecular dynamics simulation, surface plasmon resonance, superoxide dismutase

Procedia PDF Downloads 115
673 Effect of Alkalinity of Water on the Aggregation of Colloidal Silver Nanoparticles

Authors: Fedda Y. Alzoubi, Ihsan A. Aljarrah

Abstract:

Silver nanoparticles (AgNPs) are among the most vital and fascinating of the metallic nanoparticles involved in different applications, especially biomedical ones. Samples of water of different alkalinity were prepared in order to study the effect of the alkalinity of water on the optical properties, size, and morphology of colloidal AgNPs prepared according to the chemical reduction method using the prepared water samples. An ultraviolet-visible spectrophotometer, a Zetasizer, and a scanning electron microscope (SEM) were utilized to carry out this study. Absorption spectra of AgNPs in water of different alkalinity show a surface plasmon resonance (SPR) peak at a wavelength of 420 nm. The position of this peak is sensitive to the shape of the particles, and in our case, it indicates that the particles are spherical. As the alkalinity increases, the intensity of the SPR peak decreases, indicating the aggregation of particles. Zetasizer measurements show that the average diameter of AgNPs in pure water is 53.51 nm, and this value increases as the alkalinity increases. The zeta potential values of the samples show that the negatively coated particles are stable in the solution. SEM images confirm the spherical shape of the prepared nanoparticles and show that, as the alkalinity increases, the particles aggregate into larger particles.

Keywords: aggregation, alkalinity, colloid, nanoparticle

Procedia PDF Downloads 106
672 The Role of Piceatannol in Counteracting Glyceraldehyde-3-Phosphate Dehydrogenase Aggregation and Nuclear Translocation

Authors: Joanna Gerszon, Aleksandra Rodacka

Abstract:

In the pathogenesis of neurodegenerative diseases such as Alzheimer's disease and Parkinson's disease, protein and peptide aggregation processes play a vital role in the formation of intracellular and extracellular protein deposits. One of the major components of these deposits is oxidatively modified glyceraldehyde-3-phosphate dehydrogenase (GAPDH). Therefore, the purpose of this research was to answer the question of whether piceatannol, a stilbene derivative, counteracts and/or slows down oxidative stress-induced GAPDH aggregation. The study also aimed to determine whether this naturally occurring compound prevents the unfavorable nuclear translocation of GAPDH in hippocampal cells. Isothermal titration calorimetry (ITC) analysis indicated that one molecule of GAPDH can bind up to 8 molecules of piceatannol (7.3 ± 0.9). As a consequence of piceatannol binding to the enzyme, a loss of activity was observed. In parallel with GAPDH inactivation, changes in zeta potential and a loss of free thiol groups were noted. Nevertheless, the ligand-protein binding does not influence the secondary structure of GAPDH. Precise molecular docking analysis of the interactions inside the active center suggested that these effects are due to piceatannol's ability to form a covalent bond with the nucleophilic cysteine residue (Cys149), which is directly involved in the catalytic reaction. Molecular docking also showed that up to 11 ligand molecules can be bound to the dehydrogenase simultaneously. Taking the obtained data into consideration, the influence of piceatannol on the level of GAPDH aggregation induced by excessive oxidative stress was examined. The applied methods (thioflavin-T binding-dependent fluorescence as well as microscopy methods - transmission electron microscopy and Congo red staining) revealed that piceatannol significantly diminishes the level of GAPDH aggregation. Finally, studies involving a cellular model (Western blot analyses of nuclear and cytosolic fractions and confocal microscopy) indicated that piceatannol-GAPDH binding prevents the nuclear translocation of GAPDH induced by excessive oxidative stress in hippocampal cells; in consequence, it counteracts cell apoptosis. These studies demonstrate that, by binding to GAPDH, piceatannol blocks the cysteine residue and counteracts the oxidative modifications that induce oligomerization and GAPDH aggregation, and that it protects hippocampal cells from apoptosis by retaining GAPDH in the cytoplasm. All these findings provide new insight into the role of the piceatannol-GAPDH interaction and present a potential therapeutic strategy for some neurological disorders related to GAPDH aggregation. This work was supported by the National Science Centre, Poland (grant number 2017/25/N/NZ1/02849).

Keywords: glyceraldehyde-3-phosphate dehydrogenase, neurodegenerative disease, neuroprotection, piceatannol, protein aggregation

Procedia PDF Downloads 140
671 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible by using an efficient data collection algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure for finding the best value of the aggregation level in order to prolong network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length is determined based on an M/M[x]/1/K queue model and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
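
A hedged sketch of the sample-size step only: with an unknown distribution, a normal-approximation bound n ≥ (z·σ/e)² gives the number of readings needed for a confidence interval of half-width e (the values below are illustrative; the M/M[x]/1/K queue analysis is not reproduced here).

```python
from math import ceil
from scipy.stats import norm

def required_samples(sigma, half_width, confidence=0.95):
    """Readings needed so the mean estimate is within +/- half_width at the given confidence."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil((z * sigma / half_width) ** 2)

print(required_samples(sigma=2.0, half_width=0.5))  # 62 readings under these assumptions
```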

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 164
670 D-Wave Quantum Computing Ising Model: A Case Study for Forecasting of Heat Waves

Authors: Dmytro Zubov, Francesco Volponi

Abstract:

In this paper, the D-Wave quantum computing Ising model is used for forecasting positive extremes of daily mean air temperature. Forecast models are designed with two to five qubits, which represent 2-, 3-, 4-, and 5-day historical data, respectively. The Ising model's real-valued weights and dimensionless coefficients are calculated using daily mean air temperatures from 119 places around the world, as well as sea level (Aburatsu, Japan). In comparison with current methods, this approach is better suited to predicting heat wave values because it does not require the estimation of a probability distribution from scarce observations. The proposed quantum computing forecast algorithm is simulated on a traditional computer architecture, with combinatorial optimization of the Ising model parameters, for the Ronald Reagan Washington National Airport dataset with a 1-day lead time on the learning sample (1975-2010). Analysis of the forecast accuracy (the ratio of successful predictions to the total number of predictions) on the validation sample (2011-2014) shows that the Ising model with three qubits has 100% accuracy, which is quite significant compared to other methods. However, the number of identified heat waves is small (only one out of nineteen in this case). The other models, with 2, 4, and 5 qubits, have 20%, 3.8%, and 3.8% accuracy, respectively. The presented three-qubit forecast model is applied to the prediction of heat waves at five other locations: Aurel Vlaicu, Romania – 28.6% accuracy; Bratislava, Slovakia – 21.7%; Brussels, Belgium – 33.3%; Sofia, Bulgaria – 50%; Akhisar, Turkey – 21.4%. These predictions are not ideal, but they are not zero either; they can be used independently or together with predictions generated by other methods. The loss of human life, as well as the environmental, economic, and material damage, from extreme air temperatures could be reduced if some heat waves are predicted. Even a small success rate implies a large socio-economic benefit.
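
As a hedged classical analogue of the three-qubit model, the sketch below treats the last three days' hot/not-hot flags plus tomorrow's unknown flag as Ising spins and predicts the value of the unknown spin that minimizes the energy E(s) = -1/2 Σ J_ij s_i s_j - Σ h_i s_i; the couplings and bias are illustrative assumptions, whereas on D-Wave hardware they would be fitted to the historical series.

```python
import numpy as np

J = np.array([[0.0, 0.2, 0.1, 0.6],   # symmetric couplings between days t-2, t-1, t, t+1
              [0.2, 0.0, 0.3, 0.8],
              [0.1, 0.3, 0.0, 1.0],
              [0.6, 0.8, 1.0, 0.0]])
h = np.array([0.0, 0.0, 0.0, -0.3])   # bias against predicting a heat wave

def energy(s):
    return -0.5 * s @ J @ s - h @ s   # Ising energy of spin configuration s

def predict(history):                 # history: three +/-1 spins for days t-2..t
    candidates = [np.append(history, v) for v in (-1, 1)]
    return int(min(candidates, key=energy)[-1])

print(predict(np.array([1, 1, 1])))      # three hot days -> +1 (heat wave tomorrow)
print(predict(np.array([-1, -1, -1])))   # three cool days -> -1
```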

Keywords: heat wave, D-wave, forecast, Ising model, quantum computing

Procedia PDF Downloads 471
669 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis

Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai

Abstract:

The purpose of this study is to forecast the influences of Information and Communication Technology (ICT) on the structural changes of the Japanese economy based on Leontief Input-Output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of the input-output coefficients. The statistical significance of the model is then tested by the Likelihood Ratio Test (LRT). In our model, ICT is represented by two explanatory variables, i.e., computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that these variables had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The projected future Japanese economic structure based on the above forecast generates the differentiated direct and indirect outcomes of ICT penetration.
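
A minimal sketch of the LRT step, assuming statsmodels: fit the full model (with the two ICT regressors) and a restricted intercept-only model, then test 2·(llf_full − llf_restricted) against a chi-squared distribution. The data are synthetic stand-ins for an IO-coefficient series and the ICT variables.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(7)
computers, telecom = rng.normal(size=(2, 21))  # annual ICT series, e.g., 1985-2005
io_coef = 0.3 + 0.05 * computers + 0.02 * telecom + rng.normal(0, 0.01, 21)

full = sm.OLS(io_coef, sm.add_constant(np.column_stack([computers, telecom]))).fit()
restricted = sm.OLS(io_coef, np.ones((21, 1))).fit()  # intercept-only benchmark
lr = 2 * (full.llf - restricted.llf)                  # likelihood ratio statistic
print("LR =", round(lr, 2), "| p =", float(chi2.sf(lr, df=2)))
```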

Keywords: forecast, ICT, industrial structural changes, statistical analysis

Procedia PDF Downloads 353