Search results for: arrival time prediction
18098 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions
Authors: Pirta Palola, Richard Bailey, Lisa Wedding
Abstract:
Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. 
For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
Keywords: economics of biodiversity, environmental valuation, natural capital, value function
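The logistic value function and the Monte Carlo treatment of parameter uncertainty described above can be sketched in a few lines of Python; the threshold, steepness, and uncertainty values below are hypothetical illustrations, not figures from the study:

```python
import math
import random

def logistic_value(q, q_crit, steepness):
    # Near-binary value function: ~1 when quantity q is above the critical
    # level q_crit (resilience exists), ~0 when it is below (resilience lost).
    return 1.0 / (1.0 + math.exp(-steepness * (q - q_crit)))

def mc_expected_value(q, q_crit_mean, q_crit_sd, steepness, n=10_000, seed=42):
    # Propagate uncertainty in the critical threshold by Monte Carlo sampling.
    rng = random.Random(seed)
    draws = (logistic_value(q, rng.gauss(q_crit_mean, q_crit_sd), steepness)
             for _ in range(n))
    return sum(draws) / n

# Well above the (uncertain) threshold the expected value is near 1;
# well below it, near 0.
high = mc_expected_value(q=0.9, q_crit_mean=0.5, q_crit_sd=0.05, steepness=30)
low = mc_expected_value(q=0.1, q_crit_mean=0.5, q_crit_sd=0.05, steepness=30)
```

A value time series would follow by evaluating such a function over a modeled trajectory of the ecosystem component.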
Procedia PDF Downloads 194
18097 Attitudes of the Indigenous People from Providencia, Amazon towards the Bora Language
Authors: Angela Maria Sarmiento
Abstract:
Since the end of the 19th century, the Bora people have struggled to survive two stages of colonial domination, which resulted in situations of forced contact with the Western world. Their inclusion in global designs altered the configuration of their local spaces and social practices; thus, the Bora language was affected and prone to transformation. This descriptive, interpretive study, within the indigenous and minoritized groups’ research field, aimed at analysing the linguistic attitudes as well as the contextual situation of the Bora language in Providencia, an ancestral territory and a speech community in the midst of the Colombian Amazon rainforest. Through inquiry into their sociolinguistic practices, this study also considered the effects of the rubber exploitation of the late 19th century and of the arrival of the Capuchin mission in the early 20th century. The study followed an ethnographic approach, which allowed the researcher to examine the social phenomena from the perspective of the participants. Fieldwork, a field diary, field notes, and semi-structured interviews were conducted and then triangulated with participant observations. The findings suggest that there is a transition from the current individual bilingualism towards Spanish monolingualism; this is reinforced by the absence of a functional distribution of the three varieties (Bora, Huitoto, and Spanish). The positive attitudes towards the Spanish language are based on its functionality, while positive attitudes towards the Bora language mostly refer to pride and identity. Negative attitudes are directed only towards the Bora language. The search for the roots of these negative attitudes pointed to the traumatic experiences of the rubber exploitation and to the indigenous experiences at the Capuchin boarding school.
Finally, the situation of the Bora language can be understood as a social fact strongly connected to past colonial dominations and to the current and continuous incursion of new global-colonial designs.
Keywords: Bora language, language contact, linguistic attitudes, speech communities
Procedia PDF Downloads 147
18096 Entropy Risk Factor Model of Exchange Rate Prediction
Authors: Darrol Stanley, Levan Efremidze, Jannie Rossouw
Abstract:
We investigate the predictability of the USD/ZAR (South African Rand) exchange rate with sample entropy analytics for the period of 2004-2015. We calculate sample entropy based on the daily data of the exchange rate and conduct empirical implementation of several market timing rules based on these entropy signals. The dynamic investment portfolio based on entropy signals produces better risk-adjusted performance than a buy-and-hold strategy. The returns are estimated on the portfolio values in U.S. dollars. These results are preliminary and do not yet account for reasonable transaction costs, although these are very small in currency markets.
Keywords: currency trading, entropy, market timing, risk factor model
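As a rough illustration of the analytic at the core of this abstract, a plain-Python sample entropy implementation (the parameter choices m = 2, r = 0.2 are common defaults, and the series below are synthetic, not USD/ZAR data):

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    # SampEn = -ln(A/B): B counts pairs of length-m templates that match
    # within tolerance r * std, A counts matches of length m + 1.
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    tol = r * std

    def matches(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A regular series is more predictable (lower entropy) than a random one;
# entropy signals of this kind drive the market timing rules.
regular = [(-1) ** i for i in range(60)]
rng = random.Random(3)
noisy = [rng.random() for _ in range(60)]
```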
Procedia PDF Downloads 271
18095 Early Phase Design Study of a Sliding Door with Multibody Simulations
Authors: Erkan Talay, Mustafa Yigit Yagci
Abstract:
For systems like a sliding door, designers should predict not only the strength but also the dynamic behavior of the system, and this prediction becomes more critical when the design changes radically relative to previous designs. Physical tests can also cost more than expected, especially for rail geometry changes, since this geometry affects the design of the body. The aim of the study is to observe and understand the dynamics of the sliding door in a virtual environment. For this, a multibody dynamic model of the sliding door was built, and the effects of various parameters, such as rail geometry, roller diameters, and center of mass, were examined. A design of experiment study was also performed to observe the interactions of these parameters.
Keywords: design of experiment, minimum closing effort, multibody simulation, sliding door
Procedia PDF Downloads 137
18094 The Connection between Required Safe Egress Time and Occupant Fire Safety Training
Authors: Christina Knorr
Abstract:
Analysis of the evacuation of occupants of a building plays a significant role in Fire Safety Engineering. One of the tools used for the analysis is the concept of the Required Safe Egress Time (RSET). It is generally accepted that RSET is measured from the time the fire ignites until the time that all occupants have evacuated to a safe location. Instructions on how RSET is determined can be found in both the International Fire Engineering Guidelines and, more recently, in the Australian Fire Engineering Guidelines. The guidelines also specify measures that could be applied to reduce the RSET and hence improve the performance of fire-safety measures of a building. Further, it is suggested that the delay period can be reduced through “training programs.” This study examined the overall level of fire-safety awareness among occupants of residential apartment buildings in Australia and investigated the possible effects of fire-safety training on the delay period and, hence, the RSET. A questionnaire, interviews, and an experiment were conducted to collect data about people’s fire-safety knowledge, people’s behaviour and nature, and the duration of activities people are likely to undertake in the event of a fire. The study led to an investigation into the delay and response time approximations and the development of a new equation to incorporate the impact of training into the RSET calculations for the general use of the fire engineering community. Regardless of the RSET, it can be concluded that fire-safety education and training for residents of apartment buildings have a direct impact on improving their behaviour and firefighting equipment usage in a fire incident.
Keywords: fire safety engineering, fire safety training, occupant evacuation behaviour, required safe egress time
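The RSET decomposition implied by the guidelines (detection, alarm, pre-movement delay, travel) can be sketched as follows; the component times and the training scaling factor are hypothetical illustrations, not the equation developed in the study:

```python
def rset(detection, alarm, pre_movement, travel, training_factor=1.0):
    # RSET as the sum of its phases (all times in seconds).
    # training_factor < 1 models the shorter pre-movement (delay) period
    # of trained occupants -- an illustrative scaling, not the paper's
    # fitted relationship.
    return detection + alarm + training_factor * pre_movement + travel

untrained = rset(detection=60, alarm=30, pre_movement=120, travel=90)
trained = rset(detection=60, alarm=30, pre_movement=120, travel=90,
               training_factor=0.5)
```

Halving the delay period in this toy example shortens the RSET from 300 s to 240 s, which is the kind of effect the study quantifies.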
Procedia PDF Downloads 38
18093 Process Modeling of Electric Discharge Machining of Inconel 825 Using Artificial Neural Network
Authors: Himanshu Payal, Sachin Maheshwari, Pushpendra S. Bharti
Abstract:
Electrical discharge machining (EDM), a non-conventional machining process, finds wide application in shaping difficult-to-cut alloys. Process modeling of EDM is required to exploit the process to the fullest, and it is a challenging task owing to the involvement of many electrical and non-electrical parameters. This work is an attempt to model the EDM process using an artificial neural network (ANN). Experiments were carried out on die-sinking EDM taking Inconel 825 as the work material. ANN modeling has been performed using experimental data. The prediction ability of the trained network has been verified experimentally. Results indicate that ANN can predict the values of performance measures of EDM satisfactorily.
Keywords: artificial neural network, EDM, metal removal rate, modeling, surface roughness
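A minimal, from-scratch flavour of such an ANN process model: a one-hidden-layer network trained by stochastic gradient descent on a smooth toy response (the normalised input and synthetic target stand in for the EDM parameters and performance measures, which are not reproduced here):

```python
import math
import random

def train_mlp(xs, ys, hidden=8, epochs=3000, lr=0.1, seed=0):
    # Tiny one-hidden-layer tanh network trained by stochastic gradient
    # descent: one normalised input, one output.
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            err = sum(w2[j] * h[j] for j in range(hidden)) + b2 - y
            for j in range(hidden):
                dh = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * dh * x
                b1[j] -= lr * dh
            b2 -= lr * err

    def predict(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2

    return predict

# Fit a smooth nonlinear toy response, y = x^2 on [0, 1], standing in for
# an EDM performance measure as a function of one normalised parameter.
xs = [i / 10 for i in range(11)]
ys = [x ** 2 for x in xs]
predict = train_mlp(xs, ys)
```

In practice a multi-input network (pulse current, pulse-on time, etc.) and a library such as a standard ML toolkit would be used; this sketch only shows the mechanism.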
Procedia PDF Downloads 412
18092 Analysis of Caffeic Acid from Myrica nagi Leaves by High Performance Liquid Chromatography
Authors: Preeti Panthari, Harsha Kharkwal
Abstract:
Myrica nagi belongs to the Myricaceae family. It has been known for its therapeutic use since ancient times. The leaves were extracted with methanol and further fractionated with different solvents of increasing polarity. The n-butanol fraction of the methanol extract was passed through Celite; separation through silica gel column chromatography yielded ten fractions. For the first time, we report the isolation of caffeic acid from the chloroform:methanol (70:30) fraction of the n-butanol fraction of Myrica nagi leaves. The mobile phase used for HPLC analysis was methanol:water (60:40) at a flow rate of 1 ml/min and a wavelength of 280 nm. The retention time was 2.66 min.
Keywords: Myrica nagi, column chromatography, retention time, caffeic acid
Procedia PDF Downloads 553
18091 Removal of Nitrogen Compounds from Industrial Wastewater Using Sequencing Batch Reactor: The Effects of React Time
Authors: Ali W. Alattabi, Khalid S. Hashim, Hassnen M. Jafer, Ali Alzeyadi
Abstract:
This study was performed to optimise the react time (RT) and study its effects on the removal rates of nitrogen compounds in a sequencing batch reactor (SBR) treating synthetic industrial wastewater. The results showed that increasing the RT from 4 h to 10, 16, and 22 h significantly improved the nitrogen compounds’ removal efficiency: it increased from 69.5% to 95%, from 75.7% to 97%, and from 54.2% to 80.1% for NH3-N, NO3-N, and NO2-N, respectively. The results obtained from this study showed that an RT of 22 h was the optimum for nitrogen compound removal efficiency.
Keywords: ammonia-nitrogen, retention time, nitrate, nitrite, sequencing batch reactor, sludge characteristics
Procedia PDF Downloads 363
18090 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for the estimation of the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve.
Keywords: forecasting, credit risk, Penalized Quasi-Likelihood, Gibbs sampler, logistic regression with random effects, ROC curve
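The flavour of the Bayesian estimation can be sketched with a toy one-coefficient logistic model; for brevity, this uses a random-walk Metropolis sampler rather than the Gibbs sampler or PQL of the paper, and the data are simulated, not the Tunisian firm sample:

```python
import math
import random

def log_posterior(beta, xs, ys, prior_sd=10.0):
    # Log-likelihood of a one-coefficient logistic model plus a
    # N(0, prior_sd^2) prior on the coefficient.
    ll = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-beta * x))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard the logs
        ll += y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return ll - beta ** 2 / (2.0 * prior_sd ** 2)

def metropolis(xs, ys, n_samples=3000, step=0.5, seed=1):
    # Random-walk Metropolis sampler for the coefficient's posterior.
    rng = random.Random(seed)
    beta, lp = 0.0, log_posterior(0.0, xs, ys)
    draws = []
    for _ in range(n_samples):
        prop = beta + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, xs, ys)
        if math.log(rng.random()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        draws.append(beta)
    return draws

# Simulated "financial ratio" data with a true coefficient of 2.
data_rng = random.Random(0)
xs = [data_rng.uniform(-2.0, 2.0) for _ in range(200)]
ys = [1 if data_rng.random() < 1.0 / (1.0 + math.exp(-2.0 * x)) else 0
      for x in xs]
draws = metropolis(xs, ys)
posterior_mean = sum(draws[500:]) / len(draws[500:])  # discard burn-in
```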
Procedia PDF Downloads 542
18089 Predicting the Success of Bank Telemarketing Using Artificial Neural Network
Authors: Mokrane Selma
Abstract:
The shift towards decision making (DM) based on artificial intelligence (AI) techniques will change the way in which consumer markets and our societies function. Through AI, predictive analytics is being used by businesses to identify patterns and major trends with the objective of improving DM and influencing future business outcomes. This paper proposes an Artificial Neural Network (ANN) approach to predict the success of telemarketing calls for selling bank long-term deposits. To validate the proposed model, we use a bank marketing data set of 41,188 phone calls. The ANN attains an accuracy of 98.93%, which outperforms other conventional classifiers and confirms that it is a credible and valuable approach for telemarketing campaign managers.
Keywords: bank telemarketing, prediction, decision making, artificial intelligence, artificial neural network
Procedia PDF Downloads 159
18088 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with the rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, in comparison with other industries, they are adopted less frequently in commercial banking, especially for scoring purposes. This is due to the fact that Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model is developed at Dun and Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards with sparse cases that cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
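For reference, the classic Weight of Evidence that the Hybrid Model approximates from the ML score distribution is computed per bin from good/bad counts; the bin counts below are invented for illustration:

```python
import math

def weight_of_evidence(goods, bads):
    # Classic WoE per bin: ln((good_i / total_good) / (bad_i / total_bad)).
    # Good-heavy bins get positive WoE; bad-heavy bins get negative WoE.
    total_good, total_bad = sum(goods), sum(bads)
    return [math.log((g / total_good) / (b / total_bad))
            for g, b in zip(goods, bads)]

# Three risk bins with invented good/bad counts.
woe = weight_of_evidence(goods=[400, 300, 100], bads=[20, 60, 120])
```

The Hybrid Model's contribution is to back these per-bin values out of an ML score distribution instead of raw counts, which is what makes sparse bins workable.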
Procedia PDF Downloads 134
18087 Slip Limit Prediction of High-Strength Bolt Joints Based on Local Approach
Authors: Chang He, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang
Abstract:
In this study, the aim is to infer the slip limit (static friction limit) of contact interfaces in bolt friction joints by analyzing other bolt friction joints with the same contact surface but a different shape. By using the Weibull distribution to treat microelements on the contact surface statistically, the slip limit of a certain type of bolt joint was predicted from other types of bolt joint with the same contact surface. As a result, this research succeeded in predicting the slip limit of bolt joints with different numbers of contact surfaces and different numbers of bolt rows.
Keywords: bolt joints, slip coefficient, finite element method, Weibull distribution
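The weakest-link logic behind using a Weibull distribution to extrapolate between joint geometries can be sketched as follows; the characteristic limit, element count, and shape parameter are illustrative assumptions, not values fitted in the study:

```python
def scaled_characteristic_limit(s0_single, n_elements, shape_m):
    # Weakest-link (Weibull) scaling: with n statistically identical
    # micro-contact elements in series, the characteristic limit drops
    # by a factor n ** (-1 / m), where m is the Weibull shape parameter.
    return s0_single * n_elements ** (-1.0 / shape_m)

# Doubling the number of contact elements (e.g. more contact surfaces
# or bolt rows) lowers the characteristic slip limit slightly.
single = scaled_characteristic_limit(s0_single=100.0, n_elements=1, shape_m=20)
double = scaled_characteristic_limit(s0_single=100.0, n_elements=2, shape_m=20)
```

This size-effect relation is what lets one joint geometry be predicted from another sharing the same surface statistics.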
Procedia PDF Downloads 170
18086 A Research on Tourism Market Forecast and Its Evaluation
Authors: Min Wei
Abstract:
Traditional prediction methods for the tourism market pay more attention to the accuracy of the forecasts, ignoring the feasibility and operability of the forecasting process, which makes the predicted results difficult to test scientifically. Applying a linear regression model, this paper attempts to construct a scientific evaluation system for predictive value that ensures both the accuracy and stability of the predicted value and the feasibility and operability of the forecasting process. The findings show that such a scientific evaluation system can support the implementation of a scientific concept of development and the harmonious, coordinated development of man and nature.
Keywords: linear regression model, tourism market, forecast, tourism economics
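A minimal sketch of the linear regression forecast underlying such an evaluation system, with invented visitor numbers rather than data from the paper:

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a + b * x.
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx
    return mean_y - b * mean_x, b

# Toy yearly visitor counts (thousands): forecast year 6 by extrapolation.
years = [1, 2, 3, 4, 5]
visitors = [110, 121, 128, 142, 149]
a, b = fit_line(years, visitors)
forecast_year_6 = a + b * 6
```

An evaluation system of the kind proposed would then score not just the point forecast's accuracy but also the stability of the fitted trend across re-estimations.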
Procedia PDF Downloads 332
18085 Digital Twin for a Floating Solar Energy System with Experimental Data Mining and AI Modelling
Authors: Danlei Yang, Luofeng Huang
Abstract:
The integration of digital twin technology with renewable energy systems offers an innovative approach to predicting and optimising performance throughout the entire lifecycle. A digital twin is a continuously updated virtual replica of a real-world entity, synchronised with data from its physical counterpart and environment. Many digital twin companies today claim to have mature digital twin products, but their focus is primarily on equipment visualisation. However, the core of a digital twin should be its model, which can mirror, shadow, and thread with the real-world entity, and this core remains underdeveloped. For a floating solar energy system, a digital twin model can be defined in three aspects: (a) the physical floating solar energy system along with environmental factors such as solar irradiance and wave dynamics, (b) a digital model powered by artificial intelligence (AI) algorithms, and (c) the integration of real system data with the AI-driven model and a user interface. The experimental setup for the floating solar energy system is designed to replicate the real-ocean conditions of floating solar installations within a controlled laboratory environment. The system consists of a water tank that simulates an aquatic surface, where a floating catamaran structure supports a solar panel. The solar simulator is set up in three positions: one directly above and two inclined at a 45° angle in front of and behind the solar panel. This arrangement allows the simulation of different sun angles, such as sunrise, midday, and sunset. The solar simulator is positioned 400 mm away from the solar panel to maintain consistent solar irradiance on its surface. Stability for the floating structure is achieved through ropes attached to anchors at the bottom of the tank, which simulate the mooring systems used in real-world floating solar applications. The floating solar energy system's sensor setup includes various devices to monitor environmental and operational parameters.
An irradiance sensor measures solar irradiance on the photovoltaic (PV) panel. Temperature sensors monitor ambient air and water temperatures, as well as the PV panel temperature. Wave gauges measure wave height, while load cells capture mooring force. Inclinometers and ultrasonic sensors record heave and pitch amplitudes of the floating system’s motions. An electric load measures the voltage and current output from the solar panel. All sensors collect data simultaneously. Artificial neural network (ANN) algorithms are central to developing the digital model, which processes historical and real-time data, identifies patterns, and predicts the system’s performance in real time. The data collected from the various sensors are partly used to train the digital model, with the remaining data reserved for validation and testing. The digital twin model combines the experimental setup with the ANN model, enabling monitoring, analysis, and prediction of the floating solar energy system's operation. The digital model mirrors the functionality of the physical setup, running in sync with the experiment to provide real-time insights and predictions. It provides useful industrial benefits, such as informing maintenance plans as well as design and control strategies for optimal energy efficiency. In the long term, this digital twin will help improve the overall solar energy yield while minimising operational costs and risks.
Keywords: digital twin, floating solar energy system, experiment setup, artificial intelligence
Procedia PDF Downloads 8
18084 Modeling SET Effect on Charge Pump Phase Locked Loop
Authors: Varsha Prasad, S. Sandya
Abstract:
Cosmic ray effects in microelectronics, such as single event transients (SET) and total ionizing dose (TID), have been of major concern in space electronics since the 1970s. Advanced CMOS technologies have demonstrated reduced sensitivity to the TID effect. However, the charge pump phase locked loop is very vulnerable to single event transient effects. This paper presents an SET analysis model in which the SET is modeled as a double exponential pulse. The time domain analysis reveals that the settling time of the voltage controlled oscillator (VCO) depends on the SET pulse strength, the time constant, and the damping factor. The proposed SET analysis model is confirmed by the simulation results.
Keywords: charge pump, phase locked loop, SET, VCO
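The double exponential pulse used as the SET disturbance can be sketched directly; the peak current and time constants below are typical illustrative values, not those of the paper:

```python
import math

def set_current(t, i_peak, tau_rise, tau_fall):
    # Double exponential SET pulse, i(t) ~ exp(-t/tau_fall) - exp(-t/tau_rise),
    # normalised so that its maximum equals i_peak (requires tau_fall > tau_rise).
    if t < 0:
        return 0.0
    shape = math.exp(-t / tau_fall) - math.exp(-t / tau_rise)
    # Analytic location of the pulse maximum, used for normalisation.
    t_max = (tau_fall * tau_rise / (tau_fall - tau_rise)) * math.log(tau_fall / tau_rise)
    peak_shape = math.exp(-t_max / tau_fall) - math.exp(-t_max / tau_rise)
    return i_peak * shape / peak_shape

# A 100 uA pulse with 50 ps rise and 200 ps fall time constants,
# sampled every 10 ps over 1 ns.
i = [set_current(t * 1e-12, 100e-6, 50e-12, 200e-12) for t in range(0, 1000, 10)]
peak = max(i)
```

Injecting such a pulse at the charge pump node and observing the VCO control voltage is the essence of the time domain analysis described above.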
Procedia PDF Downloads 433
18083 Optimized Microwave Pretreatment of Rice Straw for Conversion into Lignin Free and High Crystalline Cellulose
Authors: Mohd Ishfaq Bhat, Navin Chandra Shahi, Umesh Chandra Lohani
Abstract:
The present study aimed to evaluate the effect of microwave application in synergy with conventional sodium chlorite delignification of rice straw biomass. For the study, a Box-Behnken experimental design was used, involving four independent parameters, each with three levels: microwave power (480-800 W), irradiation time (4-12 min), bleaching solution concentration (0.4-3.0%), and bleaching time (1-5 h). The response was taken as the delignification percentage. The optimization of process parameters was done through response surface methodology. The respective optimum values of microwave power, irradiation time, bleaching solution concentration, and bleaching time were obtained as 671 W, 8.66 min, 2.67%, and 1 h. The delignification percentage achieved at the optimum conditions was 93.51%. The spectral, morphological, and X-ray diffraction characteristics of the rice straw powder after delignification showed a complete absence of lignin peaks, deconstruction of the lignocellulose complex, and an increase in crystallinity (from 39.8 to 61.6%).
Keywords: lignocellulosic biomass, delignification, microwaves, rice straw, characterization
Procedia PDF Downloads 147
18082 Time Temperature Indicator for Monitoring Freshness of Packed Pasteurized Milk
Authors: Rajeshwar S. Matche, Subhash V. Pawde, Suraj P, Sachin R. Chaudhari
Abstract:
Time-temperature indicators (TTI) are a trending approach in food packaging that provides insight into whether food products have remained safe and hygienic. TTIs currently available on the market are mostly product specific and are sometimes difficult to handle, especially in the supply chain, as they are pre-activated and require specific storage conditions. In the present study, the research focus is on the development of a cost-effective, lactic acid based TTI that can work over a wide range of temperatures and can be activated at the time of packaging or on demand. The correlation between the activation energy of the indicator's colour change and that of packed pasteurized milk spoilage with respect to time and temperature was established. The developed lactic acid based TTI strips have activation energies ranging from 10.13 to 24.20 kJ/mol. We found that the strips with activation energies of 12.42 and 14.41 kJ/mol can be correlated with the spoilage activation energy of packed pasteurized milk, which was 25.71 kJ/mol, within a factor of 2 at a storage temperature of 4°C. Applying these TTIs to packed pasteurized milk allows a visual colour change to be observed during storage and can be fruitful for monitoring the quality of the milk and understanding its freshness, especially in a cold supply chain (e.g., distributors and road vendors).
Keywords: pasteurised packed milk, time temperature indicator, spoilage, freshness
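The activation energy comparison rests on Arrhenius kinetics; a small sketch of the temperature sensitivity implied by the reported spoilage activation energy (the 4°C to 25°C comparison is our illustration, not a scenario from the study):

```python
import math

def rate_ratio(ea_kj_mol, t1_c, t2_c):
    # Arrhenius ratio k(T2)/k(T1) for activation energy Ea (kJ/mol):
    # k = A * exp(-Ea / (R * T)), so the pre-exponential factor cancels.
    r = 8.314  # gas constant, J/(mol*K)
    t1, t2 = t1_c + 273.15, t2_c + 273.15
    return math.exp(-ea_kj_mol * 1000.0 / r * (1.0 / t2 - 1.0 / t1))

# With the reported spoilage activation energy of 25.71 kJ/mol, moving
# milk from 4 C storage to 25 C roughly doubles the spoilage rate.
ratio = rate_ratio(25.71, 4.0, 25.0)
```

Matching the strip's colour-change activation energy to the spoilage activation energy (within the stated factor of 2) is what makes the indicator track freshness across temperature abuse.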
Procedia PDF Downloads 110
18081 Modeling of Bed Level Changes in Larak Island
Authors: Saeed Zeinali, Nasser Talebbeydokhti, Mehdi Saeidian, Shahrad Vosough
Abstract:
In this article, bathymetry changes have been studied for Larak Island, located in the south of Iran, as a case study. The advanced 2D model Mike21 has been used for this purpose, with a simple procedure. First, the hydrodynamic (HD) module of Mike21 is run to obtain the input required by the sediment transport (ST) module. The ST module then models the area for tidal currents only. Bed level changes result from a series of runs of both the HD and ST modules with a time step of 3 months. The final bathymetry of each time step is used as the initial bathymetry for the next time step. This consecutive procedure was continued until the bathymetry for the year 2020 was obtained.
Keywords: bed level changes, Larak Island, hydrodynamic, sediment transport
Procedia PDF Downloads 267
18080 A Value-Oriented Metamodel for Small and Medium Enterprises’ Decision Making
Authors: Romain Ben Taleb, Aurélie Montarnal, Matthieu Lauras, Mathieu Dahan, Romain Miclo
Abstract:
To be competitive and sustainable, any company has to maximize its value. However, unlike listed companies, which can assess their value based on market shares, most Small and Medium Enterprises (SMEs), which are non-listed, cannot have direct and live access to this critical information. Traditional accounting reports give SME decision-makers only limited insight into the real impact of their day-to-day decisions on the company’s performance and value. Most of the time, an SME’s financial valuation is carried out once a year, as the associated process is time- and resource-consuming, requiring several months and external expertise to complete. To solve this issue, we propose in this paper a value-oriented metamodel that enables real-time and dynamic assessment of an SME’s value based on a broad definition of its assets. These assets cover a wider scope of the company's resources and better account for intangible assets. The proposal, which is illustrated in a case study, discusses the benefits of incorporating assets in SME valuation.
Keywords: SME, metamodel, decision support system, financial valuation, assets
Procedia PDF Downloads 92
18079 Air Breakdown Voltage Prediction in Post-arcing Conditions for Compact Circuit Breakers
Authors: Jing Nan
Abstract:
The air breakdown voltage in compact circuit breakers is a critical factor in the design and reliability of electrical distribution systems. This voltage determines the threshold at which the air insulation between conductors will fail or 'break down,' leading to an arc. The phenomenon is highly sensitive to the conditions within the breaker, such as the temperature and the distance between electrodes. Typically, air breakdown voltage models have been reliable for predicting failure at standard operational temperatures. However, in post-arcing conditions, where temperatures can soar above 2000 K, these models face challenges due to the complex physics of ionization and electron behaviour at such high-energy states. We build upon the foundational understanding that the breakdown mechanism is initiated by free electrons and propelled by electric fields, which lead to ionization and, potentially, to avalanche or streamer formation, while acknowledging the complexity introduced by high-temperature environments. Given the limitations of existing experimental data, a notable research gap exists in the accurate prediction of breakdown voltage at the elevated temperatures typically observed post-arcing, where temperatures exceed 2000 K. To bridge this knowledge gap, we present a method that integrates gap distance and high-temperature effects into air breakdown voltage assessment. The proposed model is grounded in the physics of ionization, accounting for the dynamic behaviour of free electrons which, under intense electric fields at elevated temperatures, lead to thermal ionization and potentially reach the threshold for streamer formation given by Meek's criterion. Employing the Saha equation, our model calculates equilibrium electron densities, adapting to the atmospheric pressure and the hot temperature regions indicative of post-arc conditions.
Our model is rigorously validated against established experimental data, demonstrating substantial improvements in predicting air breakdown voltage in the high-temperature regime. This work significantly improves the predictive power for air breakdown voltage under conditions that closely mimic operational stressors in compact circuit breakers. Looking ahead, the proposed methods are poised for further exploration in alternative insulating media, such as SF6, enhancing the model's utility for a broader range of insulation technologies and contributing to the future of high-temperature electrical insulation research.
Keywords: air breakdown voltage, high-temperature insulation, compact circuit breakers, electrical discharge, Saha equation
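The role of the Saha equation in the model can be illustrated with a simplified form (statistical-weight and partition-function ratios set to 1; constants in SI units; the electron density and the nitrogen ionization energy used here are illustrative assumptions, not the paper's inputs):

```python
import math

def saha_ratio(t_kelvin, n_e, ionization_ev):
    # Simplified Saha equation: n_i / n_n, the ratio of ionised to neutral
    # number densities at temperature T with electron density n_e (m^-3).
    k_b = 1.380649e-23      # Boltzmann constant, J/K
    m_e = 9.1093837015e-31  # electron mass, kg
    h = 6.62607015e-34      # Planck constant, J*s
    e_ion = ionization_ev * 1.602176634e-19  # eV -> J
    thermal = (2.0 * math.pi * m_e * k_b * t_kelvin / h ** 2) ** 1.5
    return (2.0 / n_e) * thermal * math.exp(-e_ion / (k_b * t_kelvin))

# With an assumed electron density and nitrogen's ~14.53 eV ionization
# energy, ionisation is negligible at 2000 K but rises steeply by 8000 K,
# which is why post-arc hot regions dominate the breakdown behaviour.
low = saha_ratio(2000.0, 1e20, 14.53)
high = saha_ratio(8000.0, 1e20, 14.53)
```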
Procedia PDF Downloads 84
18078 Investigation of Zinc Corrosion in Tropical Soil Solution
Authors: M. Lebrini, L. Salhi, C. Deyrat, C. Roos, O. Nait-Rabah
Abstract:
The paper presents a large experimental study of the corrosion of zinc in tropical soil and in groundwater at various depths. Corrosion rates were predicted on the basis of two methods: the electrochemical method and the gravimetric method. The electrochemical results showed that the corrosion rate is highest at depths of 0 m to 0.5 m and 0.5 m to 1 m, and lower beyond these depths. The electrochemical results also indicated that a passive layer forms on the zinc surface. SEM and EDX micrographs showed that the surface is severely attacked and confirmed that a zinc oxide layer is present on the surface, whose thickness and relief increase with the duration of contact with the soil.
Keywords: soil corrosion, galvanized steel, electrochemical technique, SEM and EDX
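For context on the gravimetric method mentioned above, a corrosion rate can be computed from coupon weight loss using the standard ASTM G1-style formula. The coupon values below are hypothetical illustrations, not data from this study.

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, exposure_hours, density_g_cm3):
    """Corrosion rate from gravimetric (weight-loss) data.

    CR = K * W / (A * T * D), where K = 8.76e4 gives mm/year when
    W is in grams, A in cm^2, T in hours and D in g/cm^3 (ASTM G1 form).
    """
    K = 8.76e4
    return K * mass_loss_g / (area_cm2 * exposure_hours * density_g_cm3)

# Hypothetical coupon: 0.05 g lost over 720 h (30 days) on 10 cm^2 of zinc
rate = corrosion_rate_mm_per_year(0.05, 10.0, 720.0, 7.14)  # zinc density ~7.14 g/cm^3
print(f"corrosion rate ~ {rate:.4f} mm/year")
```

The gravimetric result is then compared against the rate inferred from electrochemical measurements (e.g. polarization resistance) to cross-validate the two methods.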
Procedia PDF Downloads 127
18077 Comparison of Different Machine Learning Models for Time-Series Based Load Forecasting of Electric Vehicle Charging Stations
Authors: H. J. Joshi, Satyajeet Patil, Parth Dandavate, Mihir Kulkarni, Harshita Agrawal
Abstract:
As the world looks towards a sustainable future, electric vehicles have become increasingly popular. Millions worldwide are looking to switch from combustion engine-powered cars to electric cars. This demand has driven an increase in electric vehicle charging stations. The key challenge is that the randomness of charging demand makes it difficult for these stations to provide an adequate amount of energy over a specific period of time. Thus, it has become increasingly crucial to model these patterns and forecast the energy needs of charging stations. This paper analyzes how different machine learning models perform on electric vehicle charging time-series data. The data set consists of authentic electric vehicle data from the Netherlands, comprising around ten thousand transactions from public stations operated by EVnetNL.
Keywords: forecasting, smart grid, electric vehicle load forecasting, machine learning, time series forecasting
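To illustrate the kind of time-series baselines such a model comparison starts from (the paper's actual models and the EVnetNL data are not reproduced here), the following sketch contrasts a persistence forecast with a seasonal-naive forecast on a synthetic hourly load profile with a daily cycle:

```python
import math

def mae(actual, forecast):
    """Mean absolute error between two equal-length series."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

# Synthetic hourly charging load: a daily cycle plus a slow weekly drift
hours = 24 * 28
load = [5.0 + 3.0 * math.sin(2 * math.pi * t / 24)
            + 0.5 * math.sin(2 * math.pi * t / (24 * 7))
        for t in range(hours)]

season = 24
actual = load[season:]

# Persistence: "the next hour looks like this hour"
persistence = load[season - 1:-1]

# Seasonal naive: "this hour looks like the same hour yesterday"
seasonal = load[:-season]

print(f"persistence MAE:    {mae(actual, persistence):.3f}")
print(f"seasonal-naive MAE: {mae(actual, seasonal):.3f}")
```

On strongly periodic demand the seasonal-naive baseline wins; any learned model (regression trees, LSTMs, etc.) should be judged against both.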
Procedia PDF Downloads 106
18076 CPU Architecture Based on Static Hardware Scheduler Engine and Multiple Pipeline Registers
Authors: Ionel Zagan, Vasile Gheorghita Gaitan
Abstract:
The development of CPUs, and of the real-time systems based on them, has made it possible to use time at increasingly fine resolutions. Together with scheduling methods and algorithms, time management has been improved to meet the need for optimization and to match the way the CPU is used. This paper contains both a detailed theoretical description and the results obtained from research on improving the performance of the nMPRA (Multi Pipeline Register Architecture) processor by implementing specific functions in hardware. The proposed CPU architecture has been developed, simulated and validated using the FPGA Virtex-7 circuit, via an SoC project. Although the hardware structure of the nMPRA processor, with its five pipeline stages, is very complex, the present paper presents and analyzes the tests dedicated to the implementation of the CPU and of the on-chip memory for instructions and data. Various tests were performed to practically implement and exercise the entire SoC project, in order to verify the peripheral drivers and the boot module, named Bootloader.
Keywords: hardware scheduler, nMPRA processor, real-time systems, scheduling methods
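As a conceptual sketch of what a static hardware scheduler engine decides each clock cycle, consider the toy software model below. It is not the nMPRA hardware itself, and the task names and parameters are invented for illustration.

```python
def run_static_scheduler(tasks, cycles):
    """Cycle-accurate toy model of a static priority scheduler.

    tasks: dict name -> {"priority": int, "release": cycle, "budget": cycles}.
    A lower priority number means higher priority. Each cycle the
    highest-priority released task with remaining budget runs; the trace
    records which task ran (None for an idle cycle).
    """
    remaining = {n: t["budget"] for n, t in tasks.items()}
    trace = []
    for cycle in range(cycles):
        ready = [n for n, t in tasks.items()
                 if cycle >= t["release"] and remaining[n] > 0]
        if not ready:
            trace.append(None)  # idle cycle
            continue
        chosen = min(ready, key=lambda n: tasks[n]["priority"])
        remaining[chosen] -= 1
        trace.append(chosen)
    return trace

tasks = {
    "isr":  {"priority": 0, "release": 2, "budget": 2},  # preempts on release
    "ctrl": {"priority": 1, "release": 0, "budget": 4},
    "log":  {"priority": 2, "release": 0, "budget": 3},
}
print(run_static_scheduler(tasks, 10))
```

In a hardware scheduler this selection is a combinational priority encoder evaluated every cycle, and per-task pipeline register sets (as in nMPRA) make the resulting context switches effectively free.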
Procedia PDF Downloads 267
18075 Speeding up Nonlinear Time History Analysis of Base-Isolated Structures Using a Nonlinear Exponential Model
Authors: Nicolò Vaiana, Giorgio Serino
Abstract:
The nonlinear time history analysis of seismically base-isolated structures can require significant computational effort when the behavior of each seismic isolator is predicted with the widely used Bouc-Wen differential equation model. In this paper, a nonlinear exponential model, able to simulate the response of seismic isolation bearings within a relatively large displacement range, is described and adopted in order to reduce the numerical computations and speed up the nonlinear dynamic analysis. Compared to the Bouc-Wen model, the proposed one does not require the numerical solution of a nonlinear differential equation at each time step of the analysis. The seismic response of a 3D base-isolated structure with a lead rubber bearing system subjected to harmonic earthquake excitation is simulated by modeling each isolator with the proposed analytical model. Comparing the numerical results and computational time with those obtained by modeling the lead rubber bearings with the Bouc-Wen model demonstrates the good accuracy of the proposed model and its capability to significantly reduce the computational effort of the analysis.
Keywords: base isolation, computational efficiency, nonlinear exponential model, nonlinear time history analysis
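The computational contrast described above can be sketched as follows: the Bouc-Wen restoring force requires integrating the hysteretic variable z at every time step, while an algebraic exponential-type force law (an illustrative form chosen for this sketch, not the authors' exact model) is evaluated directly with no per-step ODE solve.

```python
import math

def sign(u):
    return (u > 0) - (u < 0)

def bouc_wen_forces(x_hist, dt, k=1.0, alpha=0.1, A=1.0, beta=0.5, gamma=0.5):
    """Bouc-Wen restoring force: the hysteretic variable z is governed by an
    ODE that must be numerically integrated at every step (explicit Euler
    here, with the exponent n = 1 for brevity)."""
    z, x_prev, forces = 0.0, x_hist[0], []
    for x in x_hist:
        v = (x - x_prev) / dt
        z += dt * v * (A - abs(z) * (gamma + beta * sign(v * z)))
        forces.append(alpha * k * x + (1.0 - alpha) * k * z)
        x_prev = x
    return forces

def exponential_backbone(x, k1=1.0, k2=0.1, x0=0.05):
    """Algebraic exponential-type restoring force: a stiff initial branch
    (k1) transitioning exponentially to a post-yield branch (k2), evaluated
    in closed form."""
    return k2 * x + (k1 - k2) * x0 * (1.0 - math.exp(-abs(x) / x0)) * sign(x)

dt = 0.01
xs = [0.1 * math.sin(2 * math.pi * t * dt) for t in range(200)]
bw = bouc_wen_forces(xs, dt)
exp_f = [exponential_backbone(x) for x in xs]
print(f"Bouc-Wen peak force:    {max(map(abs, bw)):.4f}")
print(f"Exponential peak force: {max(map(abs, exp_f)):.4f}")
```

With thousands of isolators and time steps, removing the inner ODE solve is where the reported speed-up comes from.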
Procedia PDF Downloads 384
18074 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor
Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric
Abstract:
Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in reasonably short time. This purpose needs, first of all, a powerful dynamic car detector model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes HOG block features from foreground blobs using a configurable search window and pathway, in order to overcome the shortcoming of the HOG descriptor in terms of computing time and to improve its performance in dynamic applications. Indeed, we show in this paper that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector, reaching a satisfactory recognition rate in record time in dynamic outdoor scenes and outperforming several popular approaches without using sophisticated and expensive architectures such as GPUs and FPGAs.
Keywords: car detector, HOG, motion, computing time
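To make the HOG building block concrete, a minimal sketch of the per-cell orientation histogram is shown below. It uses unsigned gradients over 0-180 degrees with no bin interpolation or block normalization, and makes no claim to match the authors' optimized pipeline.

```python
import math

def hog_cell_histogram(cell, n_bins=9):
    """Orientation histogram for one cell (a 2D list of gray values), the
    building block of a HOG descriptor. Central-difference gradients;
    votes are weighted by gradient magnitude."""
    h, w = len(cell), len(cell[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = cell[y][x + 1] - cell[y][x - 1]
            gy = cell[y + 1][x] - cell[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[min(int(ang / (180.0 / n_bins)), n_bins - 1)] += mag
    return hist

# A vertical edge: left half dark, right half bright -> purely horizontal
# gradient, so every vote falls in the 0-degree bin
cell = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
print(hog_cell_histogram(cell))
```

Restricting this computation to foreground blobs, as the paper proposes, means the histograms are only evaluated where motion segmentation says a vehicle might be, which is where the computing-time savings come from.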
Procedia PDF Downloads 323
18073 Time Effective Structural Frequency Response Testing with Oblique Impact
Authors: Khoo Shin Yee, Lian Yee Cheng, Ong Zhi Chao, Zubaidah Ismail, Siamak Noroozi
Abstract:
Structural frequency response testing is accurate in identifying the dynamic characteristics of a machinery structure. From a practical perspective, conventional structural frequency response testing, such as experimental modal analysis with the impulse technique (also known as 'impulse testing'), has limitations, especially its long acquisition time. The long acquisition time is mainly due to a redundant procedure in which the engineer has to repeat the test in three directions, namely the axial, horizontal and vertical axes, in order to comprehensively define the dynamic behavior of a 3D structure. This is unfavorable to numerous industries where downtime cost is high. This study proposes to reduce the testing time by using oblique impact. Theoretically, a single oblique impact can induce significant vibration responses and vibration modes in all three directions. Hence, with the oblique impulse technique the acquisition time can be reduced by a factor of three (i.e. for a 3D dynamic system). This study initiates an experimental investigation of impulse testing with oblique excitation. A motor-driven test rig was used for the testing, and its dynamic characteristics were identified using impulse testing with the conventional normal impact and with the proposed oblique impact, respectively. The results show that the proposed oblique impulse testing is able to obtain all the desired natural frequencies in all three directions, thus providing a feasible solution for a fast and time-effective way of conducting impulse testing.
Keywords: frequency response function, impact testing, modal analysis, oblique angle, oblique impact
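The geometric argument behind oblique impact can be sketched directly: decomposing a single impact applied at oblique azimuth and elevation angles (angles chosen here purely for illustration) shows non-zero excitation on all three axes at once, whereas a normal impact excites only one.

```python
import math

def impact_components(force, azimuth_deg, elevation_deg):
    """Decompose a single impact of magnitude `force` into the axial (x),
    horizontal (y) and vertical (z) excitation components."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    fx = force * math.cos(el) * math.cos(az)
    fy = force * math.cos(el) * math.sin(az)
    fz = force * math.sin(el)
    return fx, fy, fz

# A normal (vertical) impact excites only one axis...
print(impact_components(100.0, azimuth_deg=0.0, elevation_deg=90.0))
# ...while one oblique impact at 45/45 degrees excites all three at once
print(impact_components(100.0, azimuth_deg=45.0, elevation_deg=45.0))
```

The trade-off is that each component carries only a fraction of the total impact energy, so the oblique angle must be chosen so that all three directional responses stay above the noise floor.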
Procedia PDF Downloads 501
18072 Quantum Dynamics for General Time-Dependent Three Coupled Oscillators
Authors: Salah Menouar, Sara Hassoul
Abstract:
The dynamics of three time-dependent coupled oscillators are studied through an approach based on decoupling them using the unitary transformation method. A first unitary transformation converts the Hamiltonian of the complicated original system into an equivalent but simpler one associated with three coupled oscillators whose masses are unity. Finally, we diagonalize the matrix representation of the transformed Hamiltonian using a unitary matrix. The diagonalized Hamiltonian is exactly that of three simple oscillators. Through these procedures, the coupled oscillatory subsystems are completely decoupled. Thanks to this decoupling, we can develop the complete dynamics of the whole system in an easy way by examining each oscillator independently. Such a development of the mechanical theory can be carried out regardless of how complicated the parameters' variations are.
Keywords: Schrödinger equation, Hamiltonian, time-dependent three coupled oscillators, unitary transformation
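The decoupling idea can be illustrated on a time-independent special case, far simpler than the paper's general time-dependent problem: for a uniform chain of three unit masses, an orthogonal modal transformation diagonalizes the stiffness matrix, and each mode then evolves independently. The closed-form modes below are those of the classic tridiagonal chain, used here only as an illustration.

```python
import math

def stiffness_matrix(k=1.0, n=3):
    """Stiffness matrix of n unit masses in a chain, each end tied to a wall."""
    K = [[0.0] * n for _ in range(n)]
    for i in range(n):
        K[i][i] = 2.0 * k
        if i > 0:
            K[i][i - 1] = -k
        if i < n - 1:
            K[i][i + 1] = -k
    return K

def normal_modes(k=1.0, n=3):
    """Closed-form modes of the uniform chain: mode m has shape
    v_j = sin(j*m*pi/(n+1)) and eigenvalue 2k(1 - cos(m*pi/(n+1)));
    with unit masses the natural frequency is sqrt(eigenvalue)."""
    modes = []
    for m in range(1, n + 1):
        lam = 2.0 * k * (1.0 - math.cos(m * math.pi / (n + 1)))
        vec = [math.sin(j * m * math.pi / (n + 1)) for j in range(1, n + 1)]
        modes.append((math.sqrt(lam), vec))
    return modes

# Check the decoupling: K v = lambda v for every mode, so in modal
# coordinates each oscillator evolves independently of the others
K = stiffness_matrix()
for freq, vec in normal_modes():
    Kv = [sum(K[i][j] * vec[j] for j in range(3)) for i in range(3)]
    residual = max(abs(Kv[i] - freq ** 2 * vec[i]) for i in range(3))
    print(f"omega = {freq:.4f}, residual = {residual:.2e}")
```

The paper's contribution is to achieve the analogous diagonalization for the quantum Hamiltonian even when the masses and couplings vary in time.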
Procedia PDF Downloads 98
18071 Grid and Market Integration of Large Scale Wind Farms using Advanced Predictive Data Mining Techniques
Authors: Umit Cali
Abstract:
The integration of intermittent energy sources such as wind farms into the electricity grid has become an important challenge for the operation and control of electric power systems because of the fluctuating behaviour of wind power generation. Wind power predictions improve the economic and technical integration of large amounts of wind energy into the existing electricity grid. Trading, balancing, grid operation, controllability and safety issues increase the importance of predicting the power output of wind farms. Therefore, wind power forecasting systems have to be integrated into the monitoring and control systems of the transmission system operator (TSO) and of wind farm operators and traders. Wind forecasts are relatively precise only for a horizon of a few hours and are therefore most relevant to the spot and intraday markets. In this work, predictive data mining techniques are applied to identify statistical and neural network models, or sets of models, that can be used to predict the power output of large onshore and offshore wind farms. These advanced data analytic methods help us distill very large meteorological, oceanographic and SCADA data sets into useful information and manageable systems. Accurate wind power forecasts benefit wind plant operators, utility operators and utility customers: an accurate forecast allows grid operators to schedule economically efficient generation to meet the demand of electrical customers. This study is also dedicated to an in-depth consideration of issues such as the comparison of day-ahead and short-term wind power forecasting results, determination of the accuracy of the wind power prediction, and evaluation of the energy-economic and technical benefits of wind power forecasting.
Keywords: renewable energy sources, wind power, forecasting, data mining, big data, artificial intelligence, energy economics, power trading, power grids
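One way to see why wind forecasts are precise only for a horizon of a few hours is a persistence baseline on a synthetic AR(1) wind-speed proxy (illustrative parameters, not real SCADA data): the forecast error grows steadily with the horizon, which is what separates intraday from day-ahead market relevance.

```python
import math
import random

def simulate_wind(n, phi=0.98, sigma=0.3, mean=8.0, seed=7):
    """AR(1) proxy for hourly wind speed in m/s; purely illustrative."""
    random.seed(seed)
    x, out = mean, []
    for _ in range(n):
        x = mean + phi * (x - mean) + random.gauss(0.0, sigma)
        out.append(max(x, 0.0))
    return out

def persistence_mae(series, horizon):
    """Mean absolute error of the forecast 'future = current value'."""
    errs = [abs(future - now)
            for now, future in zip(series[:-horizon], series[horizon:])]
    return sum(errs) / len(errs)

wind = simulate_wind(24 * 90)
for h in (1, 6, 24):  # intraday vs day-ahead horizons
    print(f"{h:2d} h ahead: persistence MAE = {persistence_mae(wind, h):.3f} m/s")
```

Any statistical or neural model proposed for day-ahead trading has to beat this horizon-dependent persistence curve to be worth deploying.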
Procedia PDF Downloads 518
18070 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks
Authors: Walid Fantazi
Abstract:
The development of sensor technology (such as microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing and wireless sensor applications) has contributed to a broad range of WSN applications capable of collecting a large amount of spatiotemporal data in real time. These systems require real-time processing to manage storage and query the data as they arrive. To cover these needs, we propose in this paper a snapshot spatiotemporal data model based on object-oriented concepts. This model reduces storage requirements and data redundancy, which makes it easier to execute spatiotemporal queries and saves analysis time. Further, to ensure the robustness of the system and to avoid congestion of main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. On top of this, we offer an RIA (Rich Internet Application)-based SOA application architecture that allows remote monitoring and control.
Keywords: WSN, indexing data, SOA, RIA, geographic information system
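To illustrate what in-memory spatiotemporal indexing buys for range queries, here is a generic grid-hash sketch (the paper's Captree* structure is not described in enough detail here to reproduce, so this is a stand-in): readings are bucketed by a coarse spatial cell and a time window, so a query only scans the buckets it overlaps instead of every stored reading.

```python
from collections import defaultdict

class GridTimeIndex:
    """In-memory spatiotemporal index (illustrative grid-hash sketch)."""

    def __init__(self, cell_size=10.0, time_window=60.0):
        self.cell_size = cell_size
        self.time_window = time_window
        self.buckets = defaultdict(list)

    def _key(self, x, y, t):
        return (int(x // self.cell_size), int(y // self.cell_size),
                int(t // self.time_window))

    def insert(self, x, y, t, value):
        self.buckets[self._key(x, y, t)].append((x, y, t, value))

    def query(self, x_min, x_max, y_min, y_max, t_min, t_max):
        """Scan only the buckets the query box overlaps, then filter exactly."""
        hits = []
        for cx in range(int(x_min // self.cell_size), int(x_max // self.cell_size) + 1):
            for cy in range(int(y_min // self.cell_size), int(y_max // self.cell_size) + 1):
                for ct in range(int(t_min // self.time_window), int(t_max // self.time_window) + 1):
                    for (x, y, t, v) in self.buckets.get((cx, cy, ct), []):
                        if x_min <= x <= x_max and y_min <= y <= y_max and t_min <= t <= t_max:
                            hits.append(v)
        return hits

idx = GridTimeIndex()
idx.insert(3.0, 4.0, 10.0, "temp=21.5")
idx.insert(55.0, 4.0, 10.0, "temp=19.0")
print(idx.query(0, 10, 0, 10, 0, 120))
```

A tree-based index like the paper's would additionally adapt its granularity to the data distribution instead of using a fixed grid.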
Procedia PDF Downloads 253
18069 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams
Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem
Abstract:
In this 'information explosion era', analyzing data, a critical commodity, and mining knowledge from vertically distributed data streams incur a huge communication cost. However, efforts to decrease communication in a distributed environment can adversely affect classification accuracy; the research challenge therefore lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous distributed data
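A minimal sketch of the transmit-only-when-informative idea behind such methods (our own simplification using a Beta-Bernoulli posterior and a drift threshold, not the paper's actual protocol): each node updates a local posterior over its event rate and sends a summary to the coordinator only when that posterior has moved materially since the last transmission.

```python
def stream_with_bayes_filter(stream, threshold=0.1, prior=(1.0, 1.0)):
    """Process a binary stream at one node, transmitting a posterior summary
    only when the Beta posterior mean has drifted more than `threshold`
    since the last transmission. Returns (transmissions, stream length)."""
    alpha, beta = prior
    last_sent = alpha / (alpha + beta)
    transmissions = 0
    for bit in stream:
        alpha, beta = alpha + bit, beta + (1 - bit)
        mean = alpha / (alpha + beta)
        if abs(mean - last_sent) > threshold:
            transmissions += 1  # send the updated summary to the coordinator
            last_sent = mean
    return transmissions, len(stream)

# 200 observations whose underlying rate shifts partway through:
# only a handful of transmissions are needed to track it
stream = [0] * 60 + [1] * 40 + [1] * 80 + [0] * 20
sent, total = stream_with_bayes_filter(stream)
print(f"transmitted {sent} of {total} updates")
```

The threshold is exactly the transmission-cost/accuracy dial the abstract describes: raising it saves bandwidth at the price of a staler global model.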
Procedia PDF Downloads 161