Search results for: stock predictions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1415

635 Effect of Inventory Management on Financial Performance: Evidence from Nigerian Conglomerate Companies

Authors: Adamu Danlami Ahmed

Abstract:

Inventory management is a key determinant of effective and efficient work for any manager. This study examined the relationship between inventory management and financial performance. The population of the study comprises all conglomerate companies quoted on the Nigerian Stock Exchange as at 31st December 2010. The scope of the study covered the period from 2010 to 2014. Descriptive statistics, Pearson correlation, and multiple regression were used to analyze the data. It was found that inventory management is significantly related to the profitability of the company. This implies that efficient management of the inventory cycle will enhance the profitability of the company, whereas poor inventory management will hinder the financial performance of organizations. Based on the results, it is recommended that conglomerate companies keep inventories to a minimum, maintain proper checks so that only needed inventories are held in store, and track the movement of goods in order to avoid unnecessary delays of finished and work-in-progress (WIP) goods in the store and warehouse.

Keywords: finished goods, work in progress, financial performance, inventory

Procedia PDF Downloads 228
634 Explainable Graph Attention Networks

Authors: David Pham, Yongfeng Zhang

Abstract:

Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs, such as Graph Neural Networks (GNN), on various data mining and machine learning tasks. However, most deep learning models on graphs cannot easily explain their predictions and are thus often labelled as “black boxes.” For example, the Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both accuracy and explainability and show that, in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
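To make the attention mechanism concrete, the following is a minimal NumPy sketch of single-head, attention-weighted neighborhood aggregation in the standard GAT form; it illustrates the building block that XGAT extends, not the authors' XGAT model, and all names, shapes, and data are made up.

```python
# Minimal sketch of single-head graph attention aggregation (standard GAT form,
# not the authors' XGAT); all shapes, names, and data are illustrative.
import numpy as np

def gat_layer(H, adj, W, a, alpha=0.2):
    """H: (N, F) node features, adj: (N, N) 0/1 adjacency with self-loops,
    W: (F, F_out) projection, a: (2*F_out,) attention vector."""
    Z = H @ W                                        # project node features
    N = Z.shape[0]
    e = np.zeros((N, N))                             # attention logits
    for i in range(N):
        for j in range(N):
            if adj[i, j]:
                s = a @ np.concatenate([Z[i], Z[j]])
                e[i, j] = s if s > 0 else alpha * s  # LeakyReLU
    # softmax over each node's neighborhood -> inspectable attention weights
    att = np.where(adj > 0, np.exp(e), 0.0)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z, att                              # aggregated features, weights

# toy usage: 4 nodes, 3 input features, 2 output features
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
adj = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [0, 0, 1, 1]])
out, att = gat_layer(H, adj, rng.normal(size=(3, 2)), rng.normal(size=(4,)))
print(att.round(2))  # rows sum to 1; larger weights mark more influential neighbors
```

The attention weights returned alongside the aggregated features are what an explainability-oriented model can expose to justify why particular neighbors dominate a prediction.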

Keywords: explainable AI, graph attention network, graph neural network, node classification

Procedia PDF Downloads 180
633 A Machine Learning Approach for Intelligent Transportation System Management on Urban Roads

Authors: Ashish Dhamaniya, Vineet Jain, Rajesh Chouhan

Abstract:

Traffic management is a major issue on most urban roads in almost all metropolitan cities in India. Speed is one of the critical traffic parameters for effective Intelligent Transportation System (ITS) implementation, as it decides the arrival rate of vehicles at an intersection, which is typically the main point of congestion. The study aimed to leverage Machine Learning (ML) models to produce precise predictions of speed on urban roadway links. The research objective was to assess how categorized traffic volume and road width, serving as variables, influence speed prediction. Four tree-based regression models, namely Decision Tree (DT), Random Forest (RF), Extra Tree (ET), and Extreme Gradient Boost (XGB), are employed for this purpose. The models' performances were validated using test data, and the results demonstrate that Random Forest surpasses other machine learning techniques and a conventional utility theory-based model in speed prediction. The study is useful for managing the urban roadway network performance under mixed traffic conditions and effective implementation of ITS.
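A rough sketch of how such a comparison could be set up with scikit-learn and XGBoost is shown below; the synthetic volume/width features and the assumed speed relationship are illustrative placeholders, not the study's field data.

```python
# Illustrative comparison of the four tree-based regressors on synthetic data;
# features and targets are placeholders for categorized volume, road width, and speed.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
X = rng.uniform([100, 5.5], [3000, 14.0], size=(500, 2))           # volume (veh/h), width (m)
y = 60 - 0.01 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 2, 500)    # speed (km/h), assumed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "DT": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=200, random_state=0),
}
try:  # XGBoost is an optional dependency
    from xgboost import XGBRegressor
    models["XGB"] = XGBRegressor(n_estimators=200, random_state=0)
except ImportError:
    pass

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "MAE:", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```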

Keywords: stream speed, urban roads, machine learning, traffic flow

Procedia PDF Downloads 61
632 Study of Parameters Influencing Dwell Times for Trains

Authors: Guillaume Craveur

Abstract:

The work presented here is a study of several parameters identified as influencing dwell times for trains. Three kinds of rolling stock are studied for this project, and the parameters examined are the number of passengers, the allocation of passengers, their priorities, the platform height, the door width and the train design. For this study, numerous observations were recorded at several stations in Paris (France). Then, in order to study these parameters, numerical simulations are carried out. The goal is to quantify the impact of each parameter on dwell times. For example, this study highlights the impact of platform height and the presence of steps between the platform and the train. Three types of station platforms are concerned by this study: the ‘optimum’ station platform, which is 920 mm high, the standard station platform, which is 550 mm high, and the high station platform, which is 1150 mm high; different kinds of steps exist in order to fill these gaps. To conclude, this study shows the impact of these parameters on dwell times and how that impact varies as a function of the size of the population.

Keywords: dwell times, numerical tools, rolling stock, platforms

Procedia PDF Downloads 330
631 Machine Learning Approach for Mutation Testing

Authors: Michael Stewart

Abstract:

Mutation testing is a type of software testing proposed in the 1970s in which program statements are deliberately changed to introduce simple errors so that test cases can be validated to determine whether they can detect the errors. Test cases are executed against the mutant code to determine whether at least one fails and thus detects the error, giving confidence that the program is correct. One major issue with this type of testing is that it becomes computationally intensive to generate and test all possible mutations for complex programs. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases, which reduced the computational cost of testing and improved test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve prediction accuracy. The performance was then evaluated on multiprocessor computers. With reinforcement learning, the mutation operators utilized were reduced by 50–100%.
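As a rough illustration of the idea (not the authors' algorithm), mutation operator selection can be framed as an epsilon-greedy bandit that is rewarded whenever a mutant is killed, i.e., at least one test fails; all operator names and the test stub below are hypothetical.

```python
# Epsilon-greedy selection of mutation operators, rewarded when a mutant is killed.
# Purely illustrative of the reinforcement-learning idea; operators and the test
# runner are placeholders, not the paper's implementation.
import random

OPERATORS = ["arith_replace", "negate_condition", "delete_statement", "swap_operands"]

def run_tests_against(mutant):
    # Placeholder: would execute the test suite against the mutant and return
    # True if at least one test fails (mutant killed).
    return random.random() < 0.5

q_value = {op: 0.0 for op in OPERATORS}   # running estimate of kill rate per operator
count = {op: 0 for op in OPERATORS}
epsilon = 0.1

for episode in range(1000):
    if random.random() < epsilon:
        op = random.choice(OPERATORS)          # explore
    else:
        op = max(q_value, key=q_value.get)     # exploit best operator so far
    mutant = f"program mutated with {op}"      # placeholder for a real mutation
    reward = 1.0 if run_tests_against(mutant) else 0.0
    count[op] += 1
    q_value[op] += (reward - q_value[op]) / count[op]  # incremental mean update

print({op: round(v, 2) for op, v in q_value.items()})
```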

Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing

Procedia PDF Downloads 192
630 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process

Authors: Hong-Ming Chen

Abstract:

This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are represented by integral functions: a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates the yield functions with minimal fitting errors and small oscillation.
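For reference, a textbook form of the Hull-White short-rate dynamics extended with a jump term is given below; the notation is the standard one and does not follow the paper's deterministic w(t) and θ(t) functions, which replace the stochastic terms.

```latex
% Textbook Hull--White short-rate model with an added jump component
% (standard notation; the paper substitutes deterministic perturbations for
% the stochastic terms dW(t) and dJ(t)).
\[
  dr(t) \;=\; \bigl[\varphi(t) - a\,r(t)\bigr]\,dt \;+\; \sigma\,dW(t) \;+\; dJ(t),
\]
where $\varphi(t)$ is the time-dependent drift fitted to the initial yield curve,
$a$ the mean-reversion speed, $\sigma$ the volatility, $W(t)$ a standard Brownian
motion, and $J(t)$ a jump process.
```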

Keywords: optimization, interest rate model, jump process, deterministic

Procedia PDF Downloads 158
629 Reproductive Biology of Fringe-Lipped Carp, Labeo fimbriatus (Bloch) from Vanivilas Sagar Reservoir of Karnataka, India

Authors: K. B. Rajanna, P. Nayana, H. N. Anjanayappa, N. Chethan

Abstract:

The ‘fringe-lipped’ peninsula carp Labeo fimbriatus is a potentially valuable and abundant fish species in the rivers and reservoirs of peninsular India. It contributes a part of the inland fish production and also plays a role in the rural economy in major-carp-deficient regions of India. The fish is locally called ‘Kemmeenu’ in Karnataka. Month-wise samples were collected from the Vanivilas Sagar Reservoir fish landing centre and fishing villages around the reservoir. The present investigation of the reproductive biology showed that ripe gonads occurred more frequently during October, November, December and January. Thus, it is concluded that the spawning season coincides with the monsoon season, and the size at maturity was found to be 36 and 37 cm total length for males and females, respectively. This study will throw light on the reproductive biology of the fish for captive broodstock development, breeding and rearing of Labeo fimbriatus. Since this fish is commercially important, the study would help to take up hatchery production.

Keywords: inland, maturity, peninsula carp, reservoir

Procedia PDF Downloads 248
628 Prediction of Saturated Hydraulic Conductivity Dynamics in an Iowan Agriculture Watershed

Authors: Mohamed Elhakeem, A. N. Thanos Papanicolaou, Christopher Wilson, Yi-Jia Chang

Abstract:

In this study, a physically based modelling framework was developed to predict saturated hydraulic conductivity (KSAT) dynamics in the Clear Creek Watershed (CCW), Iowa. The modelling framework integrated selected pedotransfer functions and watershed models with geospatial tools. A number of pedotransfer functions and agricultural watershed models were examined to select the appropriate models that represent the study site conditions. Model selection was based on statistical measures of the models’ errors compared to the KSAT field measurements conducted in the CCW under different soil, climate and land use conditions. The study has shown that the predictions of the combined pedotransfer function of Rosetta and the Water Erosion Prediction Project (WEPP) provided the best agreement with the measured KSAT values in the CCW compared to the other tested models. Therefore, Rosetta and WEPP were integrated with Geographic Information System (GIS) tools for visualization of the data in the form of geospatial maps and prediction of KSAT variability in the CCW due to seasonal changes in climate and land use activities.

Keywords: saturated hydraulic conductivity, pedotransfer functions, watershed models, geospatial tools

Procedia PDF Downloads 254
627 Artificial Neural Network and Statistical Method

Authors: Tomas Berhanu Bekele

Abstract:

Traffic congestion is one of the main transportation-related problems in developed as well as developing countries. Traffic control systems are based on the idea of avoiding traffic instabilities and homogenizing traffic flow in such a way that the risk of accidents is minimized and traffic flow is maximized. Lately, Intelligent Transport Systems (ITS) have become an important area of research for solving such road traffic-related issues and making smart decisions. ITS links people, roads and vehicles together using communication technologies to increase safety and mobility. Moreover, accurate prediction of road traffic is important for managing traffic congestion. The aim of this study is to develop an ANN model for the prediction of traffic flow and to compare the ANN model with a linear regression model of traffic flow prediction. Data extraction was carried out at intervals of 15 minutes from the video player. Video of mixed traffic flow was taken, and the vehicles were then counted during office hours in order to determine the traffic volume. Vehicles were classified into six categories, namely car, motorcycle, minibus, mid-bus, bus, and truck. The average time taken by each vehicle type to travel the trap length was measured using the time displayed on the video screen.
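A minimal sketch of the comparison described above is given below, with synthetic classified counts standing in for the 15-minute video data; the features, layer sizes, and assumed flow relationship are illustrative only.

```python
# Illustrative comparison of an ANN (multilayer perceptron) and linear regression
# for traffic-flow prediction; the synthetic 15-minute class counts are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
# features: counts of the six vehicle classes in a 15-minute interval (assumed)
X = rng.poisson(lam=[120, 80, 30, 15, 10, 20], size=(400, 6)).astype(float)
y = X.sum(axis=1) * 4 + rng.normal(0, 30, 400)          # hourly flow with noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                 random_state=0)).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)
print("ANN R^2:", round(r2_score(y_te, ann.predict(X_te)), 3))
print("OLS R^2:", round(r2_score(y_te, lin.predict(X_te)), 3))
```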

Keywords: intelligent transport system (ITS), traffic flow prediction, artificial neural network (ANN), linear regression

Procedia PDF Downloads 56
626 Near Shore Wave Manipulation for Electricity Generation

Authors: K. D. R. Jagath-Kumara, D. D. Dias

Abstract:

Sea waves carry thousands of GW of power globally. Although there are a number of different approaches to harnessing offshore energy, they are likely to be expensive, practically challenging and vulnerable to storms. Therefore, this paper considers using near-shore waves for generating mechanical and electrical power. It introduces two new approaches, wave manipulation and a variable-duct turbine, for intercepting very wide wave fronts and coping with fluctuations in wave height and sea level, respectively. The first approach effectively allows capturing much more energy with a much narrower turbine rotor. The second approach allows using a rotor with a smaller radius that captures the energy of higher wave fronts at higher sea levels while preventing it from becoming totally submerged. To illustrate the effectiveness of the approach, the paper contains a description and the simulation results of a scale model of a wave manipulator. It then includes the results of testing a physical model of the manipulator and a single-duct, axial-flow turbine in a wave flume in the laboratory. The paper also includes comparisons of theoretical predictions, simulation results and wave flume tests with respect to the incident energy, loss in wave manipulation, minimal loss, brake torque and angular velocity.

Keywords: near-shore sea waves, renewable energy, wave energy conversion, wave manipulation

Procedia PDF Downloads 474
625 Modified Plastic-Damage Model for FRP-Confined Repaired Concrete Columns

Authors: I. A. Tijani, Y. F. Wu, C. W. Lim

Abstract:

The Concrete Damaged Plasticity Model (CDPM) is capable of modeling the stress-strain behavior of confined concrete. Nevertheless, the accuracy of the model largely depends on its parameters. To date, most research works mainly focus on the identification and modification of the parameters for fiber-reinforced polymer (FRP) confined concrete prior to damage. Moreover, it has been established that FRP-strengthened concrete behaves differently from FRP-repaired concrete. This paper presents a modified plastic-damage model within the context of the CDPM in ABAQUS for modelling a uniformly FRP-confined repaired concrete under monotonic loading. The proposed model includes the inflicted damage, elastic stiffness, yield criterion and strain hardening rule. The distinct feature of damaged concrete is elastic stiffness reduction; this is included in the model. Meanwhile, the test results were obtained from physical testing of repaired concrete. The dilation model is expressed as a function of the lateral stiffness of the FRP jacket. The finite element predictions are shown to be in close agreement with the obtained test results of the repaired concrete. It was observed from the study that, with the necessary modifications, the finite element method is capable of modeling FRP-repaired concrete structures.

Keywords: concrete, FRP, damage, repairing, plasticity, finite element method

Procedia PDF Downloads 133
624 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining

Authors: İbrahim Kara, Seher Arslankaya

Abstract:

Data mining, long favored in engineering in particular, has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from large amounts of raw data in line with a given purpose and to search for the rules and relationships that enable predictions about the future to be made from large data sets. It helps the decision-maker to find the relationships among the data that emerge at the stage of decision-making. This study aims to determine the risk of heart attack at the first stage, to control it, and to plan the associated resources using data mining. Through the early and correct diagnosis of heart attacks, it aims to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce health expenditure costs, and to shorten the duration of patients’ stays at hospitals. In this way, the diagnosis and treatment costs of a heart attack are scrutinized, which is useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.

Keywords: data mining, decision support systems, heart attack, health sector

Procedia PDF Downloads 353
623 Application of Adaptive Neuro Fuzzy Inference Systems Technique for Modeling of Postweld Heat Treatment Process of Pressure Vessel Steel ASTM A516 Grade 70

Authors: Omar Al Denali, Abdelaziz Badi

Abstract:

ASTM A516 Grade 70 steel is a suitable material for the fabrication of boiler pressure vessels operating in moderate- and lower-temperature services, and it has good weldability and excellent notch toughness. Post-weld heat treatment (PWHT), or stress-relieving heat treatment, plays a significant role in avoiding the martensite transformation, which results in high hardness and can lead to cracking in the heat-affected zone (HAZ). An adaptive neuro-fuzzy inference system (ANFIS) was implemented to predict the material tensile strength of post-weld heat treatment (PWHT) experiments. The ANFIS models presented excellent predictions, and the comparison was carried out based on the mean absolute percentage error between the predicted values and the experimental values. The ANFIS model gave a mean absolute percentage error of 0.556%, which confirms the high accuracy of the model.
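For clarity, the mean absolute percentage error used to rank the ANFIS predictions can be computed as follows; the values shown are hypothetical placeholders, not the paper's tensile-strength data.

```python
# Mean absolute percentage error as commonly defined; the arrays below are
# illustrative placeholders, not the paper's tensile-strength measurements.
import numpy as np

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

measured  = [485.0, 492.0, 501.0, 478.0]   # MPa, hypothetical
predicted = [483.1, 494.5, 498.0, 480.2]   # MPa, hypothetical
print(f"MAPE = {mape(measured, predicted):.3f} %")
```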

Keywords: prediction, post-weld heat treatment, adaptive neuro-fuzzy inference system, mean absolute percentage error

Procedia PDF Downloads 146
622 Distributed Acoustic Sensing Signal Model under Static Fiber Conditions

Authors: G. Punithavathy

Abstract:

The research proposes a statistical model for distributed acoustic sensing interrogation units, which broadcast a laser pulse into the optical fiber, where interactions within the fiber determine the localized acoustic energy that causes the light reflections known as backscatter. The backscattered signal's amplitude and phase can be calculated using explicit equations. The created model makes amplitude signal spectrum and autocorrelation predictions that are confirmed by experimental findings. Phase signal characteristics that are useful for researching optical time domain reflectometry (OTDR) sensing applications are provided and examined, showing good agreement with the experiment. The experiment was successfully implemented using Python. With this model, the individual components of the distributed acoustic sensing (DAS) system can be analyzed separately. The model assumes that the fiber is in a static condition, meaning that there is no external force or vibration applied to the cable, i.e., no external acoustic disturbances are present. The backscattered signal consists of a random noise component, which is caused by the intrinsic imperfections of the fiber, and a coherent component, which is due to the laser pulse interacting with the fiber.
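A toy numerical sketch of this static-fiber picture is given below: the received backscatter is modeled as a coherent sum of scatterer contributions with fixed random phases plus an additive noise term. It is purely illustrative and does not reproduce the authors' statistical model or parameters.

```python
# Toy simulation of the static-fiber backscatter picture: the received field is a
# coherent sum of scatterer contributions with fixed random phases plus noise.
# Purely illustrative; not the authors' model or parameters.
import numpy as np

rng = np.random.default_rng(7)
n_scatterers = 2000
amplitudes = rng.rayleigh(scale=1.0, size=n_scatterers)    # scatterer strengths
phases = rng.uniform(0.0, 2 * np.pi, size=n_scatterers)    # fixed phases (static fiber)

field = np.sum(amplitudes * np.exp(1j * phases))            # coherent component
noise = rng.normal(0, 0.5) + 1j * rng.normal(0, 0.5)        # additive noise term
received = field + noise

print("amplitude:", round(abs(received), 2), "phase (rad):", round(np.angle(received), 2))
```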

Keywords: distributed acoustic sensing, optical fiber devices, optical time domain reflectometry, Rayleigh scattering

Procedia PDF Downloads 68
621 Traffic Congestions Modeling and Predictions by Social Networks

Authors: Bojan Najdenov, Danco Davcev

Abstract:

Reducing traffic congestion, along with the pollution and waste of resources that come with it, has been a big challenge in the past decades. Having reliable systems to facilitate the modeling and prediction of traffic conditions would not only reduce environmental pollution but also save people time and money. Social networks play a big role in people’s lives nowadays, providing them with means of communicating and sharing thoughts and ideas, thereby generating huge knowledge bases through crowdsourcing. In addition, crowdsourcing as a concept provides mechanisms for fast and relatively reliable data generation, and many services powered mainly by the public as content providers are used on a regular basis. In this paper we present the Social-NETS-Traffic-Control System (SNTCS), which should serve as a facilitator in the process of modeling and predicting traffic congestion. The main contribution of our system is that it integrates data from social networks such as Twitter and also implements a custom crowdsourcing subsystem with which users report traffic conditions using an Android application. Our first experience with the system confirms that the integrated approach allows easy extension of the system with other social networks and represents a very useful tool for traffic control.

Keywords: traffic, congestion reduction, crowdsource, social networks, twitter, android

Procedia PDF Downloads 474
620 Examining Foreign Student Visual Perceptions of Online Marketing Tools at a Hungarian University

Authors: Anita Kéri

Abstract:

Higher education marketing has been a widely researched field in recent years. Due to the increasing competition among higher education institutions worldwide, it has become crucial to target foreign students with effective marketing tools. Online marketing tools have become central to attracting, retaining, and satisfying the needs of foreign students. Therefore, the aim of the current study is to reveal how the online marketing tools of a Hungarian university are perceived visually by its first-year foreign students, with special emphasis on the university webpage content. Eye-camera tracking and retrospective think-aloud interviews were used to measure visual perceptions. Results show that first-year students better remember online marketing content that features familiar elements. Pictures of real-life students and their experiences attract students’ attention more, and students also remember information on these webpage elements better, compared to designs with stock photos. This research is novel in that it uses eye-camera tracking in the field of higher education marketing, thereby providing insight into the perception of online higher education marketing by foreign students.

Keywords: higher education, marketing, eye-camera, visual perceptions

Procedia PDF Downloads 94
619 Support Vector Regression Combined with Different Optimization Algorithms to Predict Global Solar Radiation on Horizontal Surfaces in Algeria

Authors: Laidi Maamar, Achwak Madani, Abdellah El Ahdj Abdellah

Abstract:

The aim of this work is to use Support Vector Regression (SVR) combined with the dragonfly, firefly, bee colony and particle swarm optimization algorithms to predict global solar radiation on horizontal surfaces in several cities in Algeria. Combining these optimization algorithms with SVR aims principally to enhance accuracy by fine-tuning the parameters, speeding up the convergence of the SVR model, and exploring a larger search space efficiently; these parameters are the regularization parameter (C), the kernel parameters, and the epsilon parameter. By doing so, the aim is to improve the generalization and predictive accuracy of the SVR model. Overall, the aim is to leverage the strengths of both SVR and optimization algorithms to create a more powerful and effective regression model for various cities and under different climate conditions. Results demonstrate close agreement between predicted and measured data in terms of different metrics. In summary, SVR has proven to be a valuable tool in modeling global solar radiation, offering accurate predictions and demonstrating versatility when combined with other algorithms or used in hybrid forecasting models.
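As a simplified stand-in for the metaheuristic tuning described above, the sketch below performs a plain random search over C, epsilon, and gamma for an SVR model; random search merely replaces the dragonfly/firefly/bee colony/PSO algorithms for illustration, and the search ranges and synthetic data are assumptions.

```python
# Simplified stand-in for metaheuristic tuning of SVR hyperparameters (C, epsilon,
# gamma): a random search over a comparable search space, on synthetic data.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# placeholder predictors (e.g., temperature, humidity, sunshine hours) and target
X = rng.uniform(size=(300, 3))
y = 2.0 * X[:, 0] + 0.5 * np.sin(6 * X[:, 1]) - X[:, 2] + rng.normal(0, 0.05, 300)

best = (None, -np.inf)
for _ in range(60):                                   # budget of candidate solutions
    params = dict(C=10 ** rng.uniform(-1, 3),
                  epsilon=10 ** rng.uniform(-3, 0),
                  gamma=10 ** rng.uniform(-3, 1))
    score = cross_val_score(SVR(**params), X, y, cv=5, scoring="r2").mean()
    if score > best[1]:
        best = (params, score)

print("best params:", {k: round(v, 4) for k, v in best[0].items()},
      "CV R^2:", round(best[1], 3))
```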

Keywords: support vector regression (SVR), optimization algorithms, global solar radiation prediction, hybrid forecasting models

Procedia PDF Downloads 27
618 Spatially Distributed Rainfall Prediction Based on Automated Kriging for Landslide Early Warning Systems

Authors: Ekrem Canli, Thomas Glade

Abstract:

The precise prediction of rainfall in space and time is a key element of most landslide early warning systems. Unfortunately, the spatial variability of rainfall in many early warning applications is often disregarded. A common simplification is to use uniformly distributed rainfall to characterize areal rainfall intensity. With spatially differentiated rainfall information, real-time comparison with rainfall thresholds or the implementation in process-based approaches might form the basis for improved landslide warnings. This study suggests an automated workflow from the hourly, web-based collection of rain gauge data to the generation of spatially differentiated rainfall predictions based on kriging. Because the application of kriging is usually a labor-intensive task, a simplified and consequently automated variogram modeling procedure was applied to up-to-date rainfall data. The entire workflow was carried out purely with open-source technology. Validation results, albeit promising, pointed out the challenges involved in purely distance-based, automated geostatistical interpolation techniques for ever-changing environmental phenomena over short temporal and spatial extents.
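Assuming the PyKrige package (an assumption; the paper only states that open-source tools were used), a minimal sketch of ordinary kriging of gauge data onto a grid looks like this, with placeholder station coordinates and rainfall values.

```python
# Minimal sketch of automated ordinary kriging of rain-gauge data onto a grid,
# assuming the PyKrige package; coordinates and rainfall values are placeholders.
import numpy as np
from pykrige.ok import OrdinaryKriging

# hourly rain-gauge observations: x (easting), y (northing), rainfall (mm)
x = np.array([2.0, 5.5, 9.0, 3.5, 7.0])
y = np.array([1.0, 4.0, 2.5, 8.0, 6.5])
rain = np.array([3.2, 5.1, 1.8, 4.4, 2.9])

ok = OrdinaryKriging(x, y, rain,
                     variogram_model="spherical",   # variogram fitted automatically
                     verbose=False, enable_plotting=False)

grid_x = np.linspace(0.0, 10.0, 50)
grid_y = np.linspace(0.0, 10.0, 50)
z, variance = ok.execute("grid", grid_x, grid_y)    # interpolated field + kriging variance
print(z.shape, float(z.max()))
```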

Keywords: kriging, landslide early warning system, spatial rainfall prediction, variogram modelling, web scraping

Procedia PDF Downloads 277
617 Enhancing Reused Lubricating Oil Performance Using Novel Ionic Liquids Based on Imidazolium Derivatives

Authors: Mohamed Deyab

Abstract:

The global lubricant additives market size was USD 14.35 billion in 2015. The industry is characterized by increasing additive usage in base oil blending for longer service life and performance. These additives improve the viscosity of the oil and act as detergents, defoamers, antioxidants, and antiwear agents. Additives play a significant role in base oil blending and subsequent formulations, as they are critical materials for improving the specification and performance of oils. Herein, we report on the synthesis and characterization of three imidazolium derivatives and their application as antioxidants, detergents and antiwear agents. The molecular structure and characterization of these ionic liquids were confirmed by elemental analysis, FTIR, X-ray diffraction (XRD) and ¹H NMR spectroscopy. Thermogravimetric analysis (TGA) is used to study the degradation and thermal stability of the studied base stock samples. It was found that all the prepared ionic liquid additives have excellent dispersion and detergency power. As additives to engine oil, the ionic liquids reduced the friction (by 38%) and wear volume (by 76%) of steel balls. The obtained results show that the ionic liquids provide oxidation inhibition of up to 95%.

Keywords: reused lubricating oil, waste, petroleum, ionic liquids

Procedia PDF Downloads 129
616 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Based on the fact that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modelling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected Central and East European countries. The empirical analysis demonstrates that the Markov regime-switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
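For reference, a generic two-regime Markov-switching GARCH(1,1) conditional variance, which may differ from the paper's exact specification, can be written as follows.

```latex
% Generic two-regime Markov-switching GARCH(1,1) conditional variance
% (illustrative form; the paper's exact specification may differ).
\[
  \sigma_t^2 \;=\; \omega_{s_t} \;+\; \alpha_{s_t}\,\varepsilon_{t-1}^2 \;+\; \beta_{s_t}\,\sigma_{t-1}^2,
  \qquad s_t \in \{1,2\},
\]
where the regime indicator $s_t$ follows a first-order Markov chain with transition
probabilities $p_{ij} = \Pr(s_t = j \mid s_{t-1} = i)$, so that low- and high-volatility
periods are governed by different parameter sets $(\omega_i, \alpha_i, \beta_i)$.
```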

Keywords: central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities

Procedia PDF Downloads 220
615 Carbon Sequestration under Hazelnut (Corylus avellana) Agroforestry and Adjacent Land Uses in the Vicinity of Black Sea, Trabzon, Turkey

Authors: Mohammed Abaoli Abafogi, Sinem Satiroglu, M. Misir

Abstract:

The current study addressed the effect of hazelnut (Corylus avellana) agroforestry on carbon sequestration. Eight sample plots were collected from the hazelnut (Corylus avellana) agroforestry using a random sampling method. The diameter of all trees in each plot with DBH ≥ 2 cm at 1.3 m was measured using a calliper. The average diameter, aboveground biomass, and carbon stock were calculated for each plot. Comparative carbon data for natural forestland were taken from KTU, and the soil C was estimated from a biomass conversion equation. Biomass carbon was significantly higher in the natural forest (68.02 Mg ha⁻¹) than in the hazelnut agroforestry (16.89 Mg ha⁻¹). SOC in the hazelnut agroforestry, the natural forest, and arable agricultural land was 7.70, 385.85, and 0.00 Mg ha⁻¹, respectively. Biomass C, on average, accounts for only 0.00% of the total C in arable agriculture, 11.02% in the hazelnut agroforestry, and 88.05% in the natural forest. The results show that the conversion of arable cropland to hazelnut agroforestry can sequester a larger amount of C in the soil as well as in the biomass than arable agricultural land.

Keywords: arable agriculture, biomass carbon, carbon sequestration, hazelnut (Corylus avellana) agroforestry, soil organic carbon

Procedia PDF Downloads 301
614 Impact of Legs Geometry on the Efficiency of Thermoelectric Devices

Authors: Angel Fabian Mijangos, Jaime Alvarez Quintana

Abstract:

Key concepts such as waste heat recycling and waste heat recovery are the basic ideas in thermoelectricity for designing new solid-state energy sources that can provide a stable supply of electricity and environmental protection. According to several theoretical predictions, at the device level the geometry and configuration of the thermoelectric legs are crucial to the thermoelectric performance of thermoelectric modules. Thus, in this work, the effect of leg geometry on the thermoelectric figure of merit ZT of the device is studied. First, asymmetrical legs are proposed in order to reduce the overall thermal conductance of the device and thereby increase the temperature gradient in the legs, as well as to harness the Thomson effect, which is generally neglected in conventional symmetrical thermoelectric legs. A novel design of a thermoelectric module with asymmetrical legs has been developed, and for the first time its thermoelectric performance has been validated experimentally with a proof-of-concept device, which shows almost twice the thermoelectric figure of merit of a conventional one. Moreover, the length of the thermoelectric legs has also been varied in order to analyze its effect on the thermoelectric performance of the device. Along with this, the impact of contact resistance in these systems has been studied. Experimental results show that the device architecture can improve the thermoelectric performance of the device by up to twofold.
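For context, the dimensionless figure of merit ZT referred to above has the conventional definition below (a standard textbook relation, independent of the specific device architecture studied here).

```latex
% Conventional definition of the thermoelectric figure of merit.
\[
  ZT \;=\; \frac{S^{2}\,\sigma\,T}{\kappa},
\]
where $S$ is the Seebeck coefficient, $\sigma$ the electrical conductivity,
$\kappa$ the thermal conductivity, and $T$ the absolute temperature; lowering the
overall thermal conductance of the legs raises the usable temperature gradient.
```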

Keywords: asymmetrical legs, heat recovery, heat recycling, thermoelectric module, Thomson effect

Procedia PDF Downloads 230
613 A Data Science Pipeline for Algorithmic Trading: A Comparative Study in Applications to Finance and Cryptoeconomics

Authors: Luyao Zhang, Tianyu Wu, Jiayi Li, Carlos-Gustavo Salas-Flores, Saad Lahrichi

Abstract:

Recent advances in AI have given algorithmic trading a central role in finance. However, current research and applications are disconnected information islands. We propose a generally applicable pipeline for designing, programming, and evaluating algorithmic trading of stocks and crypto tokens. Moreover, we provide comparative case studies for four conventional algorithms, including moving average crossover, volume-weighted average price, sentiment analysis, and statistical arbitrage. Our study offers a systematic way to program and compare different trading strategies. In addition, we implement our algorithms with object-oriented programming in Python3, which serves as open-source software for future academic research and applications.
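As an illustration of the simplest of the four strategies, a moving average crossover can be sketched as follows; the window lengths and the synthetic price series are placeholders, not the paper's pipeline, data, or parameters.

```python
# Moving average crossover signals on a synthetic price series; windows and data
# are illustrative placeholders, not the paper's implementation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
prices = pd.Series(100 + np.cumsum(rng.normal(0, 1, 500)), name="close")

fast = prices.rolling(window=20).mean()     # short-term moving average
slow = prices.rolling(window=50).mean()     # long-term moving average

position = (fast > slow).astype(int)        # 1 = long while fast MA is above slow MA
signal = position.diff().fillna(0)          # +1 = buy crossover, -1 = sell crossover

returns = prices.pct_change().fillna(0)
strategy_returns = position.shift(1).fillna(0) * returns   # trade on the next bar
print("buy signals:", int((signal == 1).sum()),
      "cumulative return:", round((1 + strategy_returns).prod() - 1, 3))
```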

Keywords: algorithmic trading, AI for finance, fintech, machine learning, moving average crossover, volume weighted average price, sentiment analysis, statistical arbitrage, pair trading, object-oriented programming, python3

Procedia PDF Downloads 132
612 Cost-Optimized Extra-Lateral Transshipments

Authors: Dilupa Nakandala, Henry Lau

Abstract:

The ever-increasing demand for cost efficiency and customer satisfaction through reliable delivery has been a mandate for logistics practitioners to continually improve inventory management processes. With cost optimization objectives, this study considers an extended scenario in which sourcing from the same echelon of the supply chain, known as lateral transshipment, which is instantaneous but more expensive than purchasing from regular suppliers, is considered by warehouses not only to reactively fulfill urgent outstanding retailer demand that could not be fulfilled from stock on hand but also to preventively reduce back-order cost. Such extra lateral transshipments, as preventive responses, are intended to meet the expected demand during the supplier lead time in a periodic-review ordering policy setting. We develop decision rules to assist logistics practitioners in making a cost-optimized selection between back-ordering and combined reactive and proactive lateral transshipment options. A method for determining the optimal quantity of extra lateral transshipment is developed considering the trade-off between purchasing, holding and back-order cost components.
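A stylized numerical sketch of the trade-off such a decision rule weighs is given below; all cost and demand figures are hypothetical, and the paper derives its rules analytically rather than by this direct comparison.

```python
# Stylized comparison of expected back-order cost versus the cost of an extra
# (preventive) lateral transshipment during the supplier lead time.
# All numbers are hypothetical illustrations, not the paper's model.
unit_transshipment_cost = 6.0     # per unit shipped laterally (instantaneous)
unit_backorder_cost = 9.0         # per unit short during the lead time
stock_on_hand = 40
mean_lead_time_demand = 55        # expected demand until the next regular delivery

expected_shortage = max(0, mean_lead_time_demand - stock_on_hand)
backorder_option = unit_backorder_cost * expected_shortage
transship_option = unit_transshipment_cost * expected_shortage

print("expected shortage:", expected_shortage)
print("back-order cost  :", backorder_option)
print("transship cost   :", transship_option)
print("decision:", "extra lateral transshipment" if transship_option < backorder_option
      else "back-order")
```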

Keywords: lateral transshipment, warehouse inventory management, cost optimization, preventive transshipment

Procedia PDF Downloads 609
611 Energy Audit and Renovation Scenarios for a Historical Building in Rome: A Pilot Case Towards the Zero Emission Building Goal

Authors: Domenico Palladino, Nicolandrea Calabrese, Francesca Caffari, Giulia Centi, Francesca Margiotta, Giovanni Murano, Laura Ronchetti, Paolo Signoretti, Lisa Volpe, Silvia Di Turi

Abstract:

The aim of achieving a fully decarbonized building stock by 2050 stands as one of the most challenging issues within the spectrum of energy and climate objectives. Numerous strategies are imperative, particularly emphasizing the reduction and optimization of energy demand. Ensuring the high energy performance of buildings emerges as a top priority, with measures aimed at cutting energy consumption. Concurrently, it is imperative to decrease greenhouse gas emissions by using renewable energy sources for on-site energy production, thereby striving for an energy balance leading towards zero-emission buildings. Italy's predominant building stock comprises ancient buildings, many of which hold historical significance and are subject to stringent preservation and conservation regulations. Attaining high levels of energy efficiency and reducing CO2 emissions in such buildings poses a considerable challenge, given their unique characteristics and the imperative to adhere to principles of conservation and restoration. Additionally, conducting a meticulous analysis of these buildings' current state is crucial for accurately quantifying their energy performance and predicting the potential impacts of proposed renovation strategies on energy consumption reduction. Within this framework, the paper presents a pilot case in Rome, outlining a methodological approach for the renovation of historic buildings towards achieving the Zero Emission Building (ZEB) objective. The building has a mixed function, with offices, a conference hall, and an exposition area. The building envelope is made of historical and precious materials used as cladding, which must be preserved. A thorough understanding of the building's current condition serves as a prerequisite for analyzing its energy performance. This involves conducting comprehensive archival research, undertaking on-site diagnostic examinations to characterize the building envelope and its systems, and evaluating actual energy usage data derived from energy bills. Energy simulation and an energy audit are the first steps in the analysis, providing an assessment of the energy performance of the current state. Subsequently, different renovation scenarios are proposed, encompassing advanced building techniques, to pinpoint the key actions necessary for improving mechanical systems, automation and control systems, and the integration of renewable energy production. These scenarios entail different levels of renovation, ranging from meeting minimum energy performance goals to achieving the highest possible energy efficiency level. The proposed interventions are meticulously analyzed and compared to ascertain the feasibility of attaining the Zero Emission Building objective. In conclusion, the paper provides valuable insights that can be extrapolated to inform a broader approach towards the energy-efficient refurbishment of historical buildings that may have limited potential for renovation of their building envelopes. By adopting a methodical and nuanced approach, it is possible to reconcile the imperative of preserving cultural heritage with the pressing need to transition towards a sustainable, low-carbon future.

Keywords: energy conservation and transition, energy efficiency in historical buildings, buildings energy performance, energy retrofitting, zero emission buildings, energy simulation

Procedia PDF Downloads 58
610 Management of Therapeutic Anticancer at Oran Teaching Hospital, Algeria

Authors: S. Boulenouar, M. Sefir, M. Benahmed

Abstract:

All facilities need medication and other pharmaceuticals for their operation. The role of management and supply is therefore to provide the facility's different services with goods and services in the required quantity and quality. Maintaining the permanent availability of drugs in facilities is very difficult because most face many difficulties in inventory management and drug supply. It is therefore necessary for each health facility to know the causes of the malfunctions of its management system in order to cope with them. It is in this context that we have undertaken this study, to identify the causes that should be taken into consideration by the concerned authorities in carrying out their mission, which is to provide quality health care for the population. In terms of financial resources, the budget for medicines represents a significant part of the pharmacy budget. Our study shows that the share of the hospital budget reserved for drug procurement represents on average 70% of the pharmacy budget. The results show a shortage of anticancer drugs at Oran Teaching Hospital. The analysis of the management process allowed us to identify at what level the problem of stock-outs of anticancer drugs arises. Suggestions were made accordingly to improve the availability of these products and to respond better to the needs of patients.

Keywords: anticancer drugs, health care facility, budget, hospital pharmacist, hospital service

Procedia PDF Downloads 435
609 Effect of Vinclozolin on Some Biochemical Parameters of Galleria mellonella (Lepidoptera: Pyralidae)

Authors: Rahile Ozturk, Esra Maltas

Abstract:

This study aimed to determine the effect of vinclozolin on some biochemical characteristics of Galleria mellonella (Lepidoptera: Pyralidae), an economically harmful species that damages the honeycomb in beekeeping. For the experimental groups, eggs obtained from the stock culture were placed on feed mixed with vinclozolin at different doses (20, 40 and 60 ppm), and the larvae were fed on this feed. As a result of the addition of vinclozolin at concentrations of 20, 40 and 60 ppm, the glycogen contents of G. mellonella were determined, and a significant reduction in the amount of glycogen was observed with increasing concentration of vinclozolin. In this study, the activity of catalase, an enzyme particularly effective in the defense mechanism, the activity of xanthine oxidase, involved in nucleotide metabolism, and the activity of glucose oxidase, involved in carbohydrate metabolism, were measured. When compared with the results from the control groups, the enzyme activities of the larvae fed with the feed containing 20, 40 and 60 ppm of vinclozolin were observed to vary or remain constant. Accordingly, glucose oxidase and catalase activities increased with the increase in the amount of vinclozolin in the feed, while the activity of xanthine oxidase remained stable.

Keywords: catalase, Galleria mellonella, glucose oxidase, vinclozolin, xanthine oxidase

Procedia PDF Downloads 289
608 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for various purposes, such as planning, operation, and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, LSTM networks, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria, namely the mean absolute error and the root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
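A minimal sketch of the SARIMA step of such a comparison, using statsmodels, is shown below; the synthetic hourly series and the chosen (p, d, q)(P, D, Q, s) orders are illustrative assumptions, not the NREL data or the paper's fitted model.

```python
# Minimal SARIMA example for 24-hour-ahead load forecasting with a daily seasonal
# period; the synthetic series and (p,d,q)(P,D,Q,s) orders are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
hours = pd.date_range("2023-01-01", periods=24 * 30, freq="h")
load = 1.0 + 0.3 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 0.05, len(hours))
series = pd.Series(load, index=hours, name="kW")

train, test = series[:-24], series[-24:]
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
forecast = model.forecast(steps=24)                      # next 24 hours

mae = np.mean(np.abs(forecast.values - test.values))
rmse = np.sqrt(np.mean((forecast.values - test.values) ** 2))
print(f"MAE = {mae:.3f} kW, RMSE = {rmse:.3f} kW")
```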

Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series

Procedia PDF Downloads 87
607 An Empirical Examination of the Determinant of the Financial CEOs’ Compensation for the Post-Financial Crisis Period

Authors: Eunsup Daniel Shim, Jooh Lee

Abstract:

The US financial crisis of 2008 and the subsequent global financial crisis were considered by many economists the worst financial crisis since the Great Depression of the 1930s. As a result, the Dodd-Frank Act was passed, which aims '(1) to promote the financial stability of the United States by improving accountability and transparency in the financial system, to end "too big to fail", (2) to protect the American taxpayer by ending bailouts, (3) to protect consumers from abusive financial services practices, and for other purposes.' The enactment of the Dodd-Frank Act was, in part, intended to significantly strengthen accountability for executive compensation, especially in financial institutions. This paper empirically investigates the changes in financial CEOs’ compensation since the financial crisis of 2008. Our findings show that in the post-financial-crisis period, financial leverage is a significant factor influencing CEOs’ total compensation. In addition, market-based performance measures such as stock price and market-to-book ratio show a significant positive relationship with CEO compensation. This change can be interpreted as an attempt to reduce the opportunistic behavior of top executives after the financial crisis and the enactment of the Dodd-Frank Act.

Keywords: financial CEO compensation, firm performance, financial crisis of 2008, dodd-frank act

Procedia PDF Downloads 514
606 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being affected by, and are undergoing change with, the rapid advancements in technology, economics, politics, society and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. The integration of design into the computational environment has revolutionized architecture, and new perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analyzing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that arises with the integration of technology into architecture is addressed within its historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported by a detailed literature review and consolidated with an examination of the focal points of 20th-century architecture under the titles of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from the historical research spanning past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture. As a result, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 181