Search results for: forecast aggregation
663 Aggregating Buyers and Sellers for E-Commerce: How Demand and Supply Meet in Fairs
Authors: Pierluigi Gallo, Francesco Randazzo, Ignazio Gallo
Abstract:
In recent years, many new and interesting models of successful online business have been developed. Many of these are based on the competition between users, such as online auctions, where the product price is not fixed and tends to rise. Other models, including group-buying, are based on cooperation between users, characterized by a dynamic price of the product that tends to go down. There is not yet a business model in which both sellers and buyers are grouped in order to negotiate on a specific product or service. The present study investigates a new extension of the group-buying model, called fair, which allows aggregation of demand and supply for price optimization, in a cooperative manner. Additionally, our system also aggregates products and destinations for shipping optimization. We introduced the following new relevant input parameters in order to implement a double-side aggregation: (a) price-quantity curves provided by the seller; (b) waiting time, that is, the longer buyers wait, the greater discount they get; (c) payment time, which determines if the buyer pays before, during or after receiving the product; (d) the distance between the place where products are available and the place of shipment, provided in advance by the buyer or dynamically suggested by the system. To analyze the proposed model we implemented a system prototype and a simulator that allows studying effects of changing some input parameters. We analyzed the dynamic price model in fairs having one single seller and a combination of selected sellers. The results are very encouraging and motivate further investigation on this topic.Keywords: auction, aggregation, fair, group buying, social buying
Procedia PDF Downloads 294
662 Forecast Combination for Asset Classes: Insights on Market Efficiency and Arbitrage
Authors: Rodrigo Baggi Prieto Alvarez, Jorge Miguel Bravo
Abstract:
The Exchange-Traded Funds (ETFs) have transformed asset allocation, allowing investors to gain exposure to diverse asset classes with a single instrument. In turn, forecast combination models have emerged as advantageous methods for improving prediction accuracy. While the Efficient Market Hypothesis (EMH) posits that prices fluctuate randomly, making abnormal returns unattainable, empirical evidence reveals autocorrelation in stock returns, challenging the EMH's strict interpretation. This raises the question of whether econometric models, machine learning methods and forecast combinations can predict asset prices more effectively. Also, comparing forecasts with futures market prices may reveal potential arbitrage opportunities, offering insights into market inefficiencies. Using ETFs indices from January 1st, 2015, to September 30th, 2024, across equity markets (S&P 500, Russell 2000, MSCI Developed Markets and MSCI Emerging Markets), fixed income (7-10 Year Treasury Bond, Developed Markets Treasury Bond, Emerging Markets Treasury Bond and U.S. Corporate Bonds), commodity (Gold Shares ETF) and crypto (ProShares Bitcoin ETF), this paper tests the predictive accuracy of traditional econometric models (ARIMA, ETS), machine learning (SVM, Random Forest, XGBoost) and forecast combinations (ARIMA-SVR, ARIMA-ANN, Ridge Regression and LASSO). Preliminary results suggest that ensemble methods can indeed outperform simple models, indicating that combinations like the Ridge Regression and LASSO are superior to econometric and machine learning models individually. Also, prediction accuracy is better for fixed income ETFs, aligned with the lower volatility of these assets, while models show higher forecast error for crypto and equity ETFs. Finally, initial comparisons between forecasts and the futures market prices reveal potential inefficiencies, suggesting opportunities for spot-futures index arbitrage. Providing empirical evidence on the application of forecasting models to a significant group of financial assets, these findings contribute to discussions on market efficiency and highlight the role of ensemble methods in improving asset price predictability and portfolio management.Keywords: ETF, asset prediction, forecast combination, EMH, spot-futures index arbitrage
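As a rough illustration of the combination step described in this abstract (not the paper's ETF data or settings), the sketch below trains ridge and LASSO meta-learners on the forecasts of several base models and compares out-of-sample errors; the series, "base models", and hyperparameters are placeholders.

```python
# Minimal sketch: combining base-model forecasts with ridge and LASSO meta-learners.
# Synthetic data; the "base models" are simulated forecasts, not fitted ARIMA/SVR/RF.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
n = 250
price = 100 + np.cumsum(rng.normal(scale=0.8, size=n))      # stand-in ETF series
base = np.column_stack([
    price + rng.normal(scale=1.0, size=n),                   # e.g. ARIMA-style forecast
    price + rng.normal(scale=1.5, size=n),                   # e.g. SVR-style forecast
    price + rng.normal(scale=2.0, size=n),                   # e.g. random-forest forecast
])

train, test = slice(0, 200), slice(200, n)
rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))

ridge = Ridge(alpha=1.0).fit(base[train], price[train])
lasso = Lasso(alpha=0.1, max_iter=10000).fit(base[train], price[train])

print("best single model :", min(rmse(base[test, j], price[test]) for j in range(3)))
print("ridge combination  :", rmse(ridge.predict(base[test]), price[test]))
print("LASSO combination  :", rmse(lasso.predict(base[test]), price[test]))
```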
Procedia PDF Downloads 4
661 Distributed Framework for Pothole Detection and Monitoring Using Federated Learning
Authors: Ezil Sam Leni, Shalen S.
Abstract:
Transport service monitoring and upkeep are essential components of smart city initiatives. The main risks to the relevant departments and authorities are the ever-increasing vehicular traffic and the conditions of the roads. In India, the economy is greatly impacted by the road transport sector. In 2021, the Ministry of Road Transport and Highways, Government of India, produced a report with statistical data on traffic accidents. The data included the number of fatalities, injuries, and other pertinent criteria. This study proposes a distributed infrastructure for the monitoring, detection, and reporting of potholes to the appropriate authorities. In the distributed environment, the nodes are the edge devices, local edge servers, and a global server. The edge devices receive the initial model to be employed from the global server. The YOLOv8 model for pothole detection is used on the edge devices, which run the detection model, gather pothole images along their path, and send the updates to the nearby edge server. The local edge server selects the clients for its aggregation process, aggregates the model updates, and sends them to the global server. The global server collects the updates from the local edge servers, performs aggregation, and derives the updated model, which incorporates the pothole information received from the local edge servers; the updates are then notified to the local edge servers and the concerned authorities for monitoring and maintenance of road conditions. The entire process is implemented in the FedCV distributed environment using the client-server model and aggregation entities. Performance indicators and the experimentation environment are assessed, discussed, and presented. In future development of this study, accelerometer data may be taken into consideration for improved performance, in addition to the images captured from the transportation routes. Keywords: federated learning, pothole detection, distributed framework, federated averaging
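To make the aggregation step concrete, the sketch below shows a plain federated-averaging (FedAvg) routine in which each client's parameters are weighted by its local sample count; the toy two-layer model and client counts are illustrative assumptions, and the paper's FedCV/YOLOv8 setup is not reproduced.

```python
# Minimal sketch of federated averaging: weight each client's parameters by its
# local sample count and sum. Toy arrays stand in for real model layers.
import numpy as np

def federated_average(client_weights, client_sizes):
    """Return the sample-size-weighted average of per-client parameter lists."""
    total = float(sum(client_sizes))
    n_layers = len(client_weights[0])
    averaged = []
    for layer in range(n_layers):
        layer_sum = sum(w[layer] * (size / total)
                        for w, size in zip(client_weights, client_sizes))
        averaged.append(layer_sum)
    return averaged

# Three edge clients, each holding a toy two-layer "model".
clients = [[np.full((2, 2), v), np.full(2, v)] for v in (1.0, 2.0, 4.0)]
sizes = [100, 300, 600]          # images gathered by each edge device

global_model = federated_average(clients, sizes)
print(global_model[0])           # weighted mean of the first layer -> 3.1
```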
Procedia PDF Downloads 104
660 Probability Fuzzy Aggregation Operators in Vehicle Routing Problem
Authors: Anna Sikharulidze, Gia Sirbiladze
Abstract:
For the evaluation of unreliability levels of movement on the closed routes in the vehicle routing problem, the fuzzy operators family is constructed. The interactions between routing factors in extreme conditions on the roads are considered. A multi-criteria decision-making model (MCDM) is constructed. Constructed aggregations are based on the Choquet integral and the associated probability class of a fuzzy measure. Propositions on the correctness of the extension are proved. Connections between the operators and the compositions of dual triangular norms are described. The conjugate connections between the constructed operators are shown. Operators reflect interactions among all the combinations of the factors in the fuzzy MCDM process. Several variants of constructed operators are used in the decision-making problem regarding the assessment of unreliability and possibility levels of movement on closed routes.Keywords: vehicle routing problem, associated probabilities of a fuzzy measure, choquet integral, fuzzy aggregation operator
Procedia PDF Downloads 326
659 Forecasting Silver Commodity Prices Using Geometric Brownian Motion: A Stochastic Approach
Authors: Sina Dehghani, Zhikang Rong
Abstract:
Historically, a variety of approaches have been taken to forecast commodity prices, given the significant implications of these prices for the global economy. An accurate forecasting tool for a valuable commodity would significantly benefit investors and governmental agencies. Silver, in particular, has grown significantly as a commodity in recent years due to its use in healthcare and technology. This manuscript utilizes the Geometric Brownian Motion predictive model to forecast silver commodity prices over multiple 3-year periods. The results of the study indicate that the model has several limitations, particularly its inability to work effectively over longer periods of time, but it remained highly effective over shorter time frames. This study sets a baseline for silver commodity forecasting with GBM, and the model could be further strengthened with refinement. Keywords: geometric Brownian motion, commodity, risk management, volatility, stochastic behavior, price forecasting
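As a sketch of the forecasting mechanism named here (not the paper's data or calibration), Geometric Brownian Motion can be simulated by estimating drift and volatility from historical log returns and projecting Monte Carlo paths forward; the price series and horizon below are placeholders.

```python
# Minimal sketch of a GBM price forecast: estimate drift/volatility from log
# returns, then simulate forward paths. The price series is a synthetic stand-in.
import numpy as np

rng = np.random.default_rng(1)
prices = 20.0 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, size=500)))  # placeholder

log_ret = np.diff(np.log(prices))
dt = 1.0                                              # one trading day
sigma = log_ret.std(ddof=1) / np.sqrt(dt)             # GBM volatility
mu = log_ret.mean() / dt + 0.5 * sigma**2             # GBM drift

horizon, n_paths = 60, 1000
shocks = rng.normal(size=(n_paths, horizon))
steps = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * shocks
paths = prices[-1] * np.exp(np.cumsum(steps, axis=1))

print("median forecast at day 60:", round(float(np.median(paths[:, -1])), 2))
print("5-95% interval:", np.round(np.percentile(paths[:, -1], [5, 95]), 2))
```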
Procedia PDF Downloads 23
658 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate
Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe
Abstract:
This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model and a SETAR model. The study uses monthly adjusted South African exchange rate data with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecasting capability of the models. The Diebold-Mariano (DM) test is employed to compare forecast accuracy and thereby distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seemed to outperform ARIMA. Keywords: ARIMA, error metrics, model selection, SETAR
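A minimal sketch of the evaluation loop described here is shown below: an ARIMA fit scored with MAE, RMSE, and MAPE on a hold-out segment. The exchange-rate series is synthetic, the ARIMA order is an assumption, and the SETAR fit and Diebold-Mariano test are not shown (they have no single standard statsmodels routine).

```python
# Minimal sketch: fit ARIMA and score its forecasts with MAE, RMSE and MAPE.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
rate = 10 + np.cumsum(rng.normal(scale=0.1, size=420))   # placeholder exchange rate
train, test = rate[:400], rate[400:]

fit = ARIMA(train, order=(1, 1, 1)).fit()
pred = fit.forecast(steps=len(test))

mae = np.mean(np.abs(pred - test))
rmse = np.sqrt(np.mean((pred - test) ** 2))
mape = np.mean(np.abs((pred - test) / test)) * 100
print(f"MAE={mae:.4f}  RMSE={rmse:.4f}  MAPE={mape:.2f}%")
```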
Procedia PDF Downloads 244
657 The Effect of Cigarette Smoking on the Production of 20-Hydroxyeicosatetraenoic Acid in Human Platelet
Authors: Yazun Jarrar
Abstract:
Smoking affects platelet aggregation and the activity of anti-platelet drugs. The chemical 20-hydroxyeicosatetraenoic acid (20-HETE) is a cardiotoxic arachidonic acid metabolite that increases platelet aggregation. In this study, we investigated the influence of cigarette smoking on 20-HETE levels and on the protein expression of the 20-HETE-producing enzyme CYP4A11 in platelets isolated from smoker and non-smoker volunteers. The protein expression and 20-HETE levels were analyzed using immunoblot and high-performance liquid chromatography with mass spectrometry (HPLC-MS) assays. The results showed that the 20-HETE level was significantly higher among smokers than non-smokers (t-test, p-value < 0.05). The protein expression of CYP4A11 was also significantly higher (t-test, p-value < 0.05) in the platelets of smokers. We conclude that cigarette smoking increases the level of the platelet activator 20-HETE by increasing the protein expression of CYP4A11. These findings may improve the understanding of smoking-drug interactions during antiplatelet therapy. Keywords: smoking, 20-HETE, CYP4A11, platelet
Procedia PDF Downloads 184
656 Machine Learning and Deep Learning Approach for People Recognition and Tracking in Crowd for Safety Monitoring
Authors: A. Degale Desta, Cheng Jian
Abstract:
Deep learning applications in computer vision are rapidly advancing, making it possible to monitor public spaces and quickly identify potentially anomalous behaviour in crowd scenes. The purpose of the current work is therefore to improve the safety of people in crowd events involving panic behaviour by introducing the idea of Aggregation of Ensembles (AOE), which makes use of pre-trained ConvNets and a pool of classifiers to find anomalies in video data of packed scenes. Since architectures such as K-means, KNN, CNN, SVD, Faster R-CNN, and YOLOv5 learn different levels of semantic representation from crowd videos, the proposed approach leverages an ensemble of various fine-tuned convolutional neural networks (CNNs), allowing for the extraction of enriched feature sets. In addition to the above algorithms, a long short-term memory neural network is used to forecast future feature values, together with a handcrafted feature that takes the peculiarities of the crowd into consideration to understand human behavior. Experiments are run on well-known datasets of panic situations to assess the effectiveness and precision of the suggested method. Results reveal that, compared to state-of-the-art methodologies, the system produces better and more promising results in terms of accuracy and processing speed. Keywords: action recognition, computer vision, crowd detecting and tracking, deep learning
Procedia PDF Downloads 161
655 Lanthanide-Mediated Aggregation of Glutathione-Capped Gold Nanoclusters Exhibiting Strong Luminescence and Fluorescence Turn-on for Sensing Alkaline Phosphatase
Authors: Jyun-Guo You, Wei-Lung Tseng
Abstract:
Herein, this study presents a synthetic route for producing highly luminescent AuNCs based on the integration of two concepts: thiol-induced luminescence enhancement of ligand-insufficient GSH-AuNCs and Ce3+-induced aggregation of GSH-AuNCs. The synthesis of GSH-AuNCs was conducted by modifying a previously reported procedure. To produce more Au(I)-GSH complexes on the surface of ligand-insufficient GSH-AuNCs, extra GSH was added to attach onto the AuNC surface. The resulting ligand-sufficient GSH-AuNCs (LS-GSH-AuNCs) emit relatively strong luminescence. The luminescence of LS-GSH-AuNCs is further enhanced by coordination between the two carboxylic groups of GSH (pKa1 = 2 and pKa2 = 3.5) and lanthanide ions, which induces the self-assembly of LS-GSH-AuNCs. As a result, the quantum yield of the self-assembled LS-GSH-AuNCs (SA-AuNCs) was improved to 13%. Interestingly, the SA-AuNCs were disassembled into LS-GSH-AuNCs in the presence of adenosine triphosphate (ATP) because of the formation of ATP-lanthanide ion complexes. Our assay was employed to detect alkaline phosphatase (ALP) activity over the range of 0.1−10 U/mL with a limit of detection (LOD) of 0.03 U/mL. Keywords: self-assembly, lanthanide ion, adenosine triphosphate, alkaline phosphatase
Procedia PDF Downloads 170
654 Simulation of Red Blood Cells in Complex Micro-Tubes
Authors: Ting Ye, Nhan Phan-Thien, Chwee Teck Lim, Lina Peng, Huixin Shi
Abstract:
In biofluid flow systems, often the flow problems of fluids of complex structures, such as the flow of red blood cells (RBCs) through complex capillary vessels, need to be considered. In this paper, we aim to apply a particle-based method, Smoothed Dissipative Particle Dynamics (SDPD), to simulate the motion and deformation of RBCs in complex micro-tubes. We first present the theoretical models, including SDPD model, RBC-fluid interaction model, RBC deformation model, RBC aggregation model, and boundary treatment model. After that, we show the verification and validation of these models, by comparing our numerical results with the theoretical, experimental and previously-published numerical results. Finally, we provide some simulation cases, such as the motion and deformation of RBCs in rectangular, cylinder, curved, bifurcated, and constricted micro-tubes, respectively.Keywords: aggregation, deformation, red blood cell, smoothed dissipative particle dynamics
Procedia PDF Downloads 174
653 A Scalable Media Job Framework for an Open Source Search Engine
Authors: Pooja Mishra, Chris Pollett
Abstract:
This paper explores efficient ways to implement various media-updating features like news aggregation, video conversion, and bulk email handling. All of these jobs share the property that they are periodic in nature, and they all benefit from being handled in a distributed fashion. The data for these jobs also often comes from a social or collaborative source. We isolate the class of periodic, one round map reduce jobs as a useful setting to describe and handle media updating tasks. As such tasks are simpler than general map reduce jobs, programming them in a general map reduce platform could easily become tedious. This paper presents a MediaUpdater module of the Yioop Open Source Search Engine Web Portal designed to handle such jobs via an extension of a PHP class. We describe how to implement various media-updating tasks in our system as well as experiments carried out using these implementations on an Amazon Web Services cluster.Keywords: distributed jobs framework, news aggregation, video conversion, email
Procedia PDF Downloads 299
652 Downscaling Seasonal Sea Surface Temperature Forecasts over the Mediterranean Sea Using Deep Learning
Authors: Redouane Larbi Boufeniza, Jing-Jia Luo
Abstract:
This study assesses the suitability of deep learning (DL) for downscaling sea surface temperature (SST) over the Mediterranean Sea in the context of seasonal forecasting. We design a set of experiments that compare different DL configurations and deploy the best-performing architecture to downscale one-month lead forecasts of June–September (JJAS) SST from the Nanjing University of Information Science and Technology Climate Forecast System version 1.0 (NUIST-CFS1.0) for the period of 1982–2020. We have also introduced predictors over a larger area to include information about the main large-scale circulations that drive SST over the Mediterranean Sea region, which improves the downscaling results. Finally, we validate the raw model and downscaled forecasts in terms of both deterministic and probabilistic verification metrics, as well as their ability to reproduce the observed precipitation extreme and spell indicator indices. The results showed that the convolutional neural network (CNN)-based downscaling consistently improves the raw model forecasts, with lower bias and more accurate representations of the observed mean and extreme SST spatial patterns. Besides, the CNN-based downscaling yields a much more accurate forecast of extreme SST and spell indicators and reduces the significant relevant biases exhibited by the raw model predictions. Moreover, our results show that the CNN-based downscaling yields better skill scores than the raw model forecasts over most portions of the Mediterranean Sea. The results demonstrate the potential usefulness of CNN in downscaling seasonal SST predictions over the Mediterranean Sea, particularly in providing improved forecast products.Keywords: Mediterranean Sea, sea surface temperature, seasonal forecasting, downscaling, deep learning
Procedia PDF Downloads 76
651 Statistical Time-Series and Neural Architecture of Malaria Patients Records in Lagos, Nigeria
Authors: Akinbo Razak Yinka, Adesanya Kehinde Kazeem, Oladokun Oluwagbenga Peter
Abstract:
Time series data are sequences of observations collected over a period of time. Such data can be used to predict health outcomes, such as disease progression, mortality, hospitalization, etc. The Statistical approach is based on mathematical models that capture the patterns and trends of the data, such as autocorrelation, seasonality, and noise, while Neural methods are based on artificial neural networks, which are computational models that mimic the structure and function of biological neurons. This paper compared both parametric and non-parametric time series models of patients treated for malaria in Maternal and Child Health Centres in Lagos State, Nigeria. The forecast methods considered linear regression, Integrated Moving Average, ARIMA and SARIMA Modeling for the parametric approach, while Multilayer Perceptron (MLP) and Long Short-Term Memory (LSTM) Network were used for the non-parametric model. The performance of each method is evaluated using the Mean Absolute Error (MAE), R-squared (R2) and Root Mean Square Error (RMSE) as criteria to determine the accuracy of each model. The study revealed that the best performance in terms of error was found in MLP, followed by the LSTM and ARIMA models. In addition, the Bootstrap Aggregating technique was used to make robust forecasts when there are uncertainties in the data.Keywords: ARIMA, bootstrap aggregation, MLP, LSTM, SARIMA, time-series analysis
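As an illustration of the bootstrap-aggregating step mentioned at the end of this abstract (not the paper's models or malaria data), the sketch below bags regression trees on lagged case counts for one-step-ahead forecasting; the series, lag order, and estimator settings are assumptions.

```python
# Minimal sketch of bootstrap aggregating (bagging) for one-step-ahead
# forecasting on lagged values of a monthly case series.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
cases = 200 + 50 * np.sin(np.arange(120) * 2 * np.pi / 12) + rng.normal(0, 10, 120)

lags = 12
X = np.column_stack([cases[i:len(cases) - lags + i] for i in range(lags)])
y = cases[lags:]

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=0)
model.fit(X[:-12], y[:-12])                      # hold out the last year
pred = model.predict(X[-12:])

mae = np.mean(np.abs(pred - y[-12:]))
rmse = np.sqrt(np.mean((pred - y[-12:]) ** 2))
print(f"bagged forecast MAE={mae:.2f}, RMSE={rmse:.2f}")
```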
Procedia PDF Downloads 75
650 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques
Authors: Jonathan Iworiso
Abstract:
Forecasting the equity premium out-of-sample is a major concern to researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars on the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding window method. A broad set of sophisticated regression models of varying complexity was employed. The RT models, which include Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The forecasts provide meaningful economic information on mean-variance portfolio investment for investors who are timing the market to earn future gains at minimal risk. Thus, the forecasting models appear to benefit an investor who optimally reallocates a monthly portfolio between equities and risk-free Treasury bills using equity premium forecasts, at minimal risk. Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains
Procedia PDF Downloads 107
649 Enhancement of Growth and Lipid Accumulation in Microalgae with Aggregation Induced Emission-Based Photosensitiser
Authors: Sharmin Ferdewsi Rakhi, A. H. M. Mohsinul Reza, Brynley Davies, Jianzhong Wang, Youhong Tang, Jian Qin
Abstract:
Mass production of microalgae has become a focus of research owing to their promising aspects for sustainable food, biofunctional compounds, and biofuel feedstock. However, low lipid content with optimum algal biomass is still a challenge that must be resolved for commercial use. This research aims to determine the effects of light spectral shift and reactive oxygen species (ROS) on growth and lipid biosynthesis in a green microalga, Chlamydomonas reinhardtii. Aggregation Induced Emission (AIE)-based photosensitisers, CN-TPAQ-PF6 ([C₃₂H₂₃N₄]+) with high ROS productivity, was introduced into the algal culture media separately for effective conversion of the green-yellow-light to the red spectra. The intense photon energy and high-photon flux density in the photosystems and ROS supplementation induced photosynthesis and lipid biogenesis. In comparison to the control, maximum algal growth (0.15 g/l) was achieved at 2 µM CN-TPAQ-PF6 exposure. A significant increase in total lipid accumulation (146.87 mg/g dry biomass) with high proportion of 10-Heptadecanoic acid (C17:1) linolenic acid (C18:2), α-linolenic acid (C18:3) was observed. The elevated level of cellular NADP/NADPH triggered the Acetyl-Co-A production in lipid biogenesis cascade. Furthermore, MTT analysis suggested that this nanomaterial is highly biocompatible on HaCat cell lines with 100% cell viability. This study reveals that the AIE-based approach can strongly impact algal biofactory development for sustainable food, healthy lipids and eco-friendly biofuel.Keywords: microalgae, photosensitiser, lipid, biomass, aggregation-induced-emission, reactive oxygen species
Procedia PDF Downloads 52
648 The Factors Predicting Credibility of News in Social Media in Thailand
Authors: Ekapon Thienthaworn
Abstract:
This research aims to study the factors predicting the credibility of news in social media, using a questionnaire-based survey. The sample is a group of 400 undergraduate students in Bangkok selected by multi-stage random sampling; the data were analyzed with descriptive statistics and multivariate regression analysis. The research found that overall trust in news read on social media was at an intermediate level. The multivariate regression analysis showed that content was the only factor with the power to predict the credibility of news as perceived by undergraduate students in Bangkok reading news on social media, at the 0.05 significance level. Thus, among the factors forecasting the credibility of news in social media, media content has the highest influence, while speed is also important for the perceived reliability of the news. Keywords: credibility of news, behaviors and attitudes, social media, web board
Procedia PDF Downloads 468
647 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations
Authors: Hamza Javar Magnier, Robin Curtis
Abstract:
There have been rapid advances in the upstream processing of protein therapeutics, which has shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity but whose formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation [1]. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact upon the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effect of the model parameters, including range of interaction, stiffness, chain length, and chain sequence, implies that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together and follow a single-step folding pathway from the denatured to the native state [2]. After parameterizing the model interactions, we developed an understanding of low-temperature conformational properties and fluctuations, and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out. This implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins "breathe" at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading. Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients. Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation
Procedia PDF Downloads 361
646 Aggregation-Induced-Active Stimuli-Responsive Based Nano-Objects for Wastewater Treatment Application
Authors: Parvaneh Eskandari, Rachel O'Reilly
Abstract:
In the last years, controlling the self-assembly behavior of stimuli-responsive nano-objects, including micelles, vesicles, worm-like, etc., at different conditions is considered a pertinent challenge in the polymer community. The aim of the project was to synthesize aggregation-induced emission (AIE)-active stimuli-responsive polymeric nano-objects to control the self-assemblies morphologies of the prepared nano-objects. Two types of nanoobjects, micelle and vesicles, including PDMAEMA-b-P(BzMA-TPEMA) [PDMAEMA: poly(N,Ndimethylaminoethyl methacrylate); P(BzMA-TPEMA): poly[benzyl methacrylate-co- tetraphenylethene methacrylate]] were synthesized by using reversible addition−fragmentation chain-transfer (RAFT)- mediated polymerization-induced self-assembly (PISA), which combines polymerization and self-assembly in a single step. Transmission electron microscope and dynamic light scattering (DLS) analysis were used to confirm the formed self-assemblies morphologies. The controlled self-assemblies were applied as nitrophenolic compounds (NPCs) adsorbents from wastewater, thanks to their CO2-responsive part, PDMAEMA. Moreover, the fluorescence-active part of the prepared nano-objects, P(BzMA-TPEMA), played a key role in the detection of the NPCs at the aqueous solution. The optical properties of the prepared nano-objects were studied by UV/Vis and fluorescence spectroscopies. For responsivity investigations, the hydrodynamic diameter and Zeta-potential (ζ-potential) of the sample's aqueous solution were measured by DLS. In the end, the prepared nano-objects were used for the detection and adsorption of different NPCs.Keywords: aggregation-induced emission polymers, stimuli-responsive polymers, reversible addition−fragmentation chain-transfer polymerization, polymerization-induced self-assembly, wastewater treatment
Procedia PDF Downloads 73
645 Analysis of Rural Roads in Developing Countries Using Principal Component Analysis and Simple Average Technique in the Development of a Road Safety Performance Index
Authors: Muhammad Tufail, Jawad Hussain, Hammad Hussain, Imran Hafeez, Naveed Ahmad
Abstract:
Road safety performance index is a composite index which combines various indicators of road safety into single number. Development of a road safety performance index using appropriate safety performance indicators is essential to enhance road safety. However, a road safety performance index in developing countries has not been given as much priority as needed. The primary objective of this research is to develop a general Road Safety Performance Index (RSPI) for developing countries based on the facility as well as behavior of road user. The secondary objectives include finding the critical inputs in the RSPI and finding the better method of making the index. In this study, the RSPI is developed by selecting four main safety performance indicators i.e., protective system (seat belt, helmet etc.), road (road width, signalized intersections, number of lanes, speed limit), number of pedestrians, and number of vehicles. Data on these four safety performance indicators were collected using observation survey on a 20 km road section of the National Highway N-125 road Taxila, Pakistan. For the development of this composite index, two methods are used: a) Principal Component Analysis (PCA) and b) Equal Weighting (EW) method. PCA is used for extraction, weighting, and linear aggregation of indicators to obtain a single value. An individual index score was calculated for each road section by multiplication of weights and standardized values of each safety performance indicator. However, Simple Average technique was used for weighting and linear aggregation of indicators to develop a RSPI. The road sections are ranked according to RSPI scores using both methods. The two weighting methods are compared, and the PCA method is found to be much more reliable than the Simple Average Technique.Keywords: indicators, aggregation, principle component analysis, weighting, index score
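A compact sketch of the two weighting schemes compared in this abstract is given below: indicators are standardized, then aggregated either with first-principal-component weights or with equal weights. The indicator matrix is a random placeholder, not the N-125 survey data, and the column meanings are illustrative.

```python
# Minimal sketch of PCA-weighted vs equal-weighted composite index scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# rows: road sections; columns (illustrative): protective-system use, road score,
# pedestrian count, vehicle count
indicators = rng.uniform(0, 1, size=(30, 4))

Z = StandardScaler().fit_transform(indicators)      # standardize each indicator

pca = PCA(n_components=1).fit(Z)
w_pca = np.abs(pca.components_[0])
w_pca /= w_pca.sum()                                # normalized PCA weights
rspi_pca = Z @ w_pca                                # PCA-weighted index score

rspi_equal = Z.mean(axis=1)                         # simple-average weighting

print("PCA weights:", np.round(w_pca, 3))
print("first five PCA-weighted scores :", np.round(rspi_pca[:5], 3))
print("first five equal-weight scores :", np.round(rspi_equal[:5], 3))
```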
Procedia PDF Downloads 158
644 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution
Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph
Abstract:
In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw data and simulated data) and two (stationary and non-stationary) models of the GEV distribution, return levels analysis is carried out and it was found that in the stationary model, the return values are constant over time with the raw data, while in the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show a sign of increase in the future, there is a maximum temperature at which there is no exceedance. The results of this paper are very vital in agricultural and environmental research.Keywords: forecasting, generalized extreme value (GEV), meteorology, return level
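As a sketch of the block-maxima and return-level calculation described here, annual maxima can be fitted to a stationary GEV with scipy and return levels read off the fitted quantile function; the temperature maxima below are simulated placeholders, not the CDC records, and the non-stationary case is not shown.

```python
# Minimal sketch of block-maxima GEV fitting and return-level estimation.
# Note: scipy's shape parameter c equals minus the usual GEV shape xi.
import numpy as np
from scipy.stats import genextreme

annual_max_temp = 33 + genextreme.rvs(c=0.1, scale=1.2, size=40, random_state=5)

c, loc, scale = genextreme.fit(annual_max_temp)     # fit the stationary GEV

for T in (5, 10, 50, 100):                           # return periods in years
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} °C")
```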
Procedia PDF Downloads 478
643 A Comparison of Methods for Neural Network Aggregation
Authors: John Pomerat, Aviv Segev
Abstract:
Recently, deep learning has had many theoretical breakthroughs. For deep learning to be successful in industry, however, there need to be practical algorithms capable of handling the many real-world hiccups that prevent the immediate application of a learning algorithm. Although AI promises to revolutionize the healthcare industry, getting access to patient data in order to train learning algorithms has not been easy. One proposed solution to this is data sharing. In this paper, we propose an alternative protocol, based on multi-party computation, to train deep learning models while maintaining both the privacy and security of training data. We examine three methods of training neural networks in this way: transfer learning, average ensemble learning, and series network learning. We compare these methods to the equivalent model obtained through data sharing across two different experiments. Additionally, we address the security concerns of this protocol. While the motivating example is healthcare, our findings regarding multi-party computation of neural network training are purely theoretical and have use cases outside the domain of healthcare. Keywords: neural network aggregation, multi-party computation, transfer learning, average ensemble learning
Procedia PDF Downloads 162
642 Exploring the Compatibility of The Rhizome and Complex Adaptive System (CAS) Theory as a Hybrid Urban Strategy Via Aggregation, Nonlinearity, and Flow
Authors: Sudaff Mohammed, Wahda Shuker Al-Hinkawi, Nada Abdulmueen Hasan
Abstract:
The compatibility of the Rhizome and Complex Adaptive system theory as a strategy within the urban context is the essential interest of this paper since there are only a few attempts to establish a hybrid, multi-scalar, and developable strategy based on the concept of the Rhizome and the CAS theory. This paper aims to establish a Rhizomic CAS strategy for different urban contexts by investigating the principles, characteristics, properties, and mechanisms of Rhizome and Complex Adaptive Systems. The research focused mainly on analyzing three properties: aggregation, non-linearity, and flow through the lens of Rhizome, Rhizomatization of CAS properties. The most intriguing result is that the principal and well-investigated characteristics of Complex Adaptive systems can be ‘Rhizomatized’ in two ways; highlighting commonalities between Rhizome and Complex Adaptive systems in addition to using Rhizome-related concepts. This paper attempts to emphasize the potency of the Rhizome as an apparently stochastic and barely anticipatable structure that can be developed to analyze cities of distinctive contexts for formulating better customized urban strategies.Keywords: rhizome, complex adaptive system (CAS), system Theory, urban system, rhizomatic CAS, assemblage, human occupation impulses (HOI)
Procedia PDF Downloads 43
641 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes
Authors: V. Churkin, M. Lopatin
Abstract:
The purpose of the paper is to estimate the US small wind turbines market potential and forecast the small wind turbines sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovations diffusion under replacement purchases. In the work an exponential distribution is used for modeling of replacement purchases. Only one parameter of such distribution is determined by average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis on the basis of the annual sales statistics which has been published by the American Wind Energy Association (AWEA) since 2001 up to 2012. The estimation of the US average market potential of small wind turbines (for adoption purchases) without account of price changes is 57080 (confidence interval from 49294 to 64866 at P = 0.95) under average lifetime of wind turbines 15 years, and 62402 (confidence interval from 54154 to 70648 at P = 0.95) under average lifetime of wind turbines 20 years. In the first case the explained variance is 90,7%, while in the second - 91,8%. The effect of the wind turbines price changes on their sales was estimated using generalized Bass model. This required a price forecast. To do this, the polynomial regression function, which is based on the Berkeley Lab statistics, was used. The estimation of the US average market potential of small wind turbines (for adoption purchases) in that case is 42542 (confidence interval from 32863 to 52221 at P = 0.95) under average lifetime of wind turbines 15 years, and 47426 (confidence interval from 36092 to 58760 at P = 0.95) under average lifetime of wind turbines 20 years. In the first case the explained variance is 95,3%, while in the second –95,3%.Keywords: bass model, generalized bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States
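To illustrate the identification step described here (nonlinear regression of a diffusion model on annual sales), the sketch below fits the basic Bass model to a placeholder sales series with scipy; the figures are not the AWEA statistics, and the generalized Bass model and replacement-purchase extension are not included.

```python
# Minimal sketch: fit the basic Bass diffusion model to cumulative annual sales
# by nonlinear least squares. Sales figures are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, M, p, q):
    """Cumulative adopters at time t for market potential M, innovation p, imitation q."""
    e = np.exp(-(p + q) * t)
    return M * (1 - e) / (1 + (q / p) * e)

years = np.arange(1, 13)                                   # e.g. 2001..2012 as t = 1..12
sales = np.array([2., 3., 3.5, 4., 5., 6.5, 8., 9., 10., 9.5, 8., 7.]) * 1000
cumulative = np.cumsum(sales)

(M, p, q), _ = curve_fit(
    bass_cumulative, years, cumulative,
    p0=(cumulative[-1] * 2, 0.01, 0.3),
    bounds=([1e3, 1e-4, 1e-3], [1e7, 0.5, 1.0]))
print(f"market potential M={M:.0f}, innovation p={p:.4f}, imitation q={q:.4f}")
```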
Procedia PDF Downloads 348
640 PM₁₀ and PM₂.₅ Concentrations in Bangkok over Last 10 Years: Implications for Air Quality and Health
Authors: Tin Thongthammachart, Wanida Jinsart
Abstract:
Atmospheric particulate matter with a diameter of less than 10 microns (PM₁₀) or less than 2.5 microns (PM₂.₅) has adverse health effects. The impact of PM was studied from both health and regulatory perspectives. Ambient PM data were collected over ten years, from 2007 to 2017, in Bangkok and its vicinity in Thailand. Statistical models were used to forecast PM concentrations from 2018 to 2020. Monthly averaged concentrations of PM₁₀ and PM₂.₅ from the monitoring data were used as input to forecast the monthly average PM concentration. The forecasting results were validated by the root mean square error (RMSE). The predicted results were used to determine the hazard risk for carcinogenic disease. The health risk values were interpolated in GIS with the ordinary kriging technique to create hazard maps for Bangkok and its vicinity. The GIS-based maps illustrated the variability of the PM distribution and high-risk locations. These results could support national policy for the protection of human health. Keywords: PM₁₀, PM₂.₅, statistical models, atmospheric particulate matter
Procedia PDF Downloads 159
639 Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility
Authors: Juliana Barcelos Cordeiro, Khashayar Mahani, Farbod Farzan, Mohsen A. Jafari
Abstract:
Energy disaggregation has been a focus of many energy companies, since energy efficiency can be achieved when the breakdown of energy consumption is known. Companies have been investing in technologies to come up with software and/or hardware solutions that can provide this type of information to the consumer. On the other hand, not all consumers can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the high-demanding end-use profiles. These energy profiles will be used to build the forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply a high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs) and present a forecast model for the energy consumption. Keywords: energy consumption forecasting, energy efficiency, load disaggregation, pattern recognition approach
Procedia PDF Downloads 278
638 Forecasting Cancers Cases in Algeria Using Double Exponential Smoothing Method
Authors: Messis A., Adjebli A., Ayeche R., Talbi M., Tighilet K., Louardiane M.
Abstract:
Cancers are the second leading cause of death worldwide. The prevalence and incidence of cancers are increasing with aging and population growth. This study aims to predict and model the evolution of breast, colorectal, lung, bladder and prostate cancers over the period 2014-2019. In this study, data were analyzed using time series analysis with the double exponential smoothing method to forecast the future pattern. To describe and fit the appropriate models, Minitab statistical software version 17 was used. Between 2014 and 2019, the raw number of newly registered cancer cases showed an increasing trend over time. The forecast model was validated by its good prediction for 2020; data for 2021 and 2022 were not available. Time series analysis showed that double exponential smoothing is an efficient tool for modeling the future raw number of new cancer cases. Keywords: cancer, time series, prediction, double exponential smoothing
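A minimal sketch of the double exponential smoothing step is shown below using the Holt implementation in statsmodels (the paper used Minitab 17); the annual case counts are placeholders, not registry data.

```python
# Minimal sketch of double (Holt) exponential smoothing on a short annual series,
# projected three periods ahead. Counts are synthetic placeholders.
import numpy as np
from statsmodels.tsa.holtwinters import Holt

cases = np.array([4200, 4450, 4700, 5010, 5280, 5600], dtype=float)  # "2014-2019"

fit = Holt(cases).fit()            # level and trend smoothing chosen by the optimizer
forecast = fit.forecast(3)         # projections for the next three years
print("forecast:", np.round(forecast))
```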
Procedia PDF Downloads 89
637 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016
Authors: Dimitra Alexiou
Abstract:
During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore their preferences with regard to the month of travel, the selected destinations, as well the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecasting with exponential smoothing, useful conclusions are arrived at that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results of this paper and the computed forecast can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, the method of Simple Exponential Smoothing of time series of data is employed. The search for a best forecast for 2017 and 2018 provides the value of the smoothing coefficient. For all statistical computations and graphics Microsoft Excel is used.Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy
Procedia PDF Downloads 265
636 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria
Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova
Abstract:
Ambient air pollution with fine particulate matter (PM10) is a systematic permanent problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allow for their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria for a period of 5 years are used. Predictors in the models are seven meteorological variables, time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series, respectively. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast future PM10 concentrations for two days ahead after the last date in the modeling procedure and show very accurate results.Keywords: cross-validation, decision tree, lagged variables, short-term forecasting
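To make the modelling setup concrete, the sketch below fits a CART regression tree to daily PM10 using meteorological predictors plus one- and two-day lagged PM10, in the spirit of the predictor design described above; the data, variables, and tree settings are illustrative assumptions, not the Pleven records.

```python
# Minimal sketch of a CART regression tree on meteorological and lagged predictors.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n = 1500
temp = rng.normal(12, 8, n)
wind = rng.gamma(2, 1.5, n)
humidity = rng.uniform(30, 95, n)
pm10 = 40 - 1.5 * wind + 0.3 * humidity - 0.4 * temp + rng.normal(0, 8, n)

# predictors: same-day meteorology plus PM10 lagged by one and two days
X = np.column_stack([temp[2:], wind[2:], humidity[2:], pm10[1:-1], pm10[:-2]])
y = pm10[2:]

tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=20, random_state=0)
tree.fit(X[:-2], y[:-2])                         # hold out the last two days
print("forecasts for the two held-out days:", np.round(tree.predict(X[-2:]), 1))
print("feature importances:", np.round(tree.feature_importances_, 3))
```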
Procedia PDF Downloads 194
635 Enhancing Knowledge Graph Convolutional Networks with Structural Adaptive Receptive Fields for Improved Node Representation and Information Aggregation
Authors: Zheng Zhihao
Abstract:
Recently, Knowledge Graph Convolutional Networks (KGCNs) have developed powerful capabilities in knowledge representation and reasoning tasks. However, traditional KGCNs often use a fixed weighting mechanism when aggregating information, failing to make full use of the rich structural information; this limits the expressive ability of node representations and easily causes over-smoothing problems. In order to solve these challenges, the paper proposes a new graph neural network model called KGCN-STAR (Knowledge Graph Convolutional Network with Structural Adaptive Receptive Fields). This model dynamically adjusts the perceptual range of each node by introducing a structural adaptive receptive field, and a subgraph aggregator is designed to capture local structural information more effectively. Experimental results show that KGCN-STAR achieves significant performance improvements on multiple knowledge graph datasets, especially in the task of representation learning for complex structures. Keywords: knowledge graph, graph neural networks, structural adaptive receptive fields, information aggregation
Procedia PDF Downloads 33
634 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network
Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar
Abstract:
The wireless sensor network is one of the most promising communication networks for monitoring remote environmental areas. In this network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have the capability of sensing, data storage and processing. The sensor nodes collect information and relay it through neighboring nodes to a particular node. Data collection and processing are handled by data aggregation techniques. For data aggregation, a clustering technique is implemented in the sensor network using a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. The information from non-cluster-head nodes is aggregated at the cluster head nodes and then transferred to the base station (or sink nodes). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Clustered data are selected for transfer to the base station instead of the whole information aggregated at the cluster head nodes. This reduces the battery consumption involved in managing the huge amount of data, and the network lifetime is greatly enhanced. Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network
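As a rough sketch of SOFM-based clustering for cluster-head selection (not the paper's protocol), the code below trains a small one-dimensional self-organizing map on sensor-node coordinates and then picks, for each neuron, the nearest node as a candidate cluster head; all coordinates and SOM settings are illustrative.

```python
# Minimal hand-rolled 1-D self-organizing feature map clustering sensor nodes
# by (x, y) position; the node nearest each neuron could act as cluster head.
import numpy as np

rng = np.random.default_rng(7)
nodes = rng.uniform(0, 100, size=(60, 2))          # sensor-node coordinates

k, epochs, lr0, radius0 = 5, 200, 0.5, 2.0
neurons = nodes[rng.choice(len(nodes), k, replace=False)].astype(float)

for epoch in range(epochs):
    lr = lr0 * (1 - epoch / epochs)                # decaying learning rate
    radius = max(radius0 * (1 - epoch / epochs), 0.5)
    for x in nodes[rng.permutation(len(nodes))]:
        bmu = int(np.argmin(np.linalg.norm(neurons - x, axis=1)))  # best matching unit
        for j in range(k):                         # neighborhood update along the 1-D map
            h = np.exp(-((j - bmu) ** 2) / (2 * radius ** 2))
            neurons[j] += lr * h * (x - neurons[j])

assignment = np.argmin(np.linalg.norm(nodes[:, None, :] - neurons[None], axis=2), axis=1)
heads = [int(np.argmin(np.linalg.norm(nodes - neurons[c], axis=1))) for c in range(k)]
print("cluster sizes:", np.bincount(assignment, minlength=k))
print("cluster-head node indices:", heads)
```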
Procedia PDF Downloads 517