Search results for: climatological weather data measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26825

26675 Data-Driven Crop Advisory – A Use Case on Grapes

Authors: Shailaja Grover, Purvi Tiwari, Vigneshwaran S. R., U. Dinesh Kumar

Abstract:

In India, grapes are one of the most important horticultural crops. They are highly vulnerable to downy mildew, one of the most devastating grape diseases. In the absence of a precise weather-based advisory system, farmers spray pesticides on their crops extensively. There are two main problems with this practice. First, most of these sprays are panic sprays that could be avoided. Second, farmers use more expensive "Preventive and Eradicate" chemicals rather than "Systemic, Curative and Anti-sporulate" chemicals. When these chemicals are used indiscriminately, they can enter the fruit and cause health problems such as cancer. This paper utilizes decision trees and predictive modeling techniques to provide grape farmers with customized advice on grape disease management. The model is expected to reduce the overall use of chemicals by approximately 50% and the cost by around 70%. Most of the grapes produced will then have pesticide residue levels below the permissible limit.
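
The paper does not publish its decision rules, so as a purely illustrative, hedged sketch, the snippet below shows how a decision tree could map weather observations to a disease-risk flag; every feature name, threshold, and label is hypothetical.

```python
# Hypothetical sketch: a weather-driven spray advisory via a decision tree.
# Feature names, data and labels are illustrative, not the authors' model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500
# Assumed predictors: leaf wetness hours, relative humidity (%), min temp (C)
X = np.column_stack([
    rng.uniform(0, 24, n),    # leaf wetness duration
    rng.uniform(40, 100, n),  # relative humidity
    rng.uniform(5, 30, n),    # minimum temperature
])
# Toy label: downy mildew risk when prolonged wetness meets high humidity
y = ((X[:, 0] > 8) & (X[:, 1] > 80) & (X[:, 2] > 10)).astype(int)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["wetness_h", "rh_pct", "tmin_c"]))
# A farmer-facing advisory would map the predicted risk to "spray"/"no spray".
```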

Keywords: analytics in agriculture, downy mildew, weather-based advisory, decision tree, predictive modelling

Procedia PDF Downloads 71
26674 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain's subconscious and conscious processes function, we must conquer the physics of Unity, which leads to duality's algorithm. Where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence, we use terms like 'time is relative,' but do we really understand the meaning? In the brain, there are different processes and, therefore, different observers. These different processes experience time at different rates. A sensory system such as the eyes completes a measurement cycle in around 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components: only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation - basically, a frozen moment in time (Flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel, where observation time is tunneled through the synchronous process and reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation, so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus a linear subconscious process generating sensory perception and thought production is being executed. It all occurs in the time available, because the other observation times are slower than the thalamic measurement time. Life in the physical universe requires a linear measurement process; it simply hides by operating at a faster time relativity. What is interesting is that time dilation is not the problem; it is the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 119
26673 Forecasting Thermal Energy Demand in District Heating and Cooling Systems Using Long Short-Term Memory Neural Networks

Authors: Kostas Kouvaris, Anastasia Eleftheriou, Georgios A. Sarantitis, Apostolos Chondronasios

Abstract:

To achieve the objective of almost zero-carbon energy solutions by 2050, the EU needs to accelerate the development of integrated, highly efficient and environmentally friendly solutions. In this direction, district heating and cooling (DHC) emerges as a viable and more efficient alternative to conventional, decentralized heating and cooling systems, enabling a combination of more efficient renewable and competitive energy supplies. In this paper, we develop a forecasting tool for near real-time local weather and thermal energy demand predictions for an entire DHC network. In this fashion, we are able to extend the functionality and improve the energy efficiency of the DHC network by predicting and adjusting the heat load that is distributed from the heat generation plant to the connected buildings through the heat pipe network. Two case studies are considered: one for Vransko, Slovenia, and one for Montpellier, France. The data consist of i) local weather data, such as humidity, temperature, and precipitation, ii) weather forecast data, such as the outdoor temperature, and iii) DHC operational parameters, such as the mass flow rate and the supply and return temperatures. The external temperature is found to be the most important energy-related variable for space conditioning, and thus it is used as an external parameter for the energy demand models. For the development of the forecasting tool, we use state-of-the-art deep neural networks, and more specifically, recurrent networks with long short-term memory (LSTM) cells, which are able to capture complex non-linear relations among temporal variables. First, we develop models to forecast outdoor temperatures for the next 24 hours using local weather data for each case study. Subsequently, we develop models to forecast thermal demand for the same period, taking into consideration past energy demand values as well as the predicted temperature values from the weather forecasting models. The contributions to the scientific and industrial community are threefold, and the empirical results are highly encouraging. First, we are able to predict future thermal demand levels for the two locations under consideration with minimal errors. Second, we examine the impact of the outdoor temperature on the predictive ability of the models and how the accuracy of the energy demand forecasts decreases with the forecast horizon. Third, we extend the relevant literature with a new dataset of thermal demand and examine the performance and applicability of machine learning techniques to solve real-world problems. Overall, the solution proposed in this paper is in accordance with EU targets, providing an automated smart energy management system, decreasing human error and reducing excessive energy production.
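
As a rough sketch of the kind of network described (not the authors' exact architecture, window length, or hyperparameters, all of which are assumed here), an LSTM mapping a window of past hourly observations to the next 24 hourly demand values could look like this:

```python
# Minimal sketch of a 24-hour-ahead LSTM forecaster; layer sizes,
# window length and features are assumptions, not the paper's values.
import numpy as np
import tensorflow as tf

window, n_features, horizon = 168, 3, 24  # one week of [demand, T_out, T_forecast]
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(window, n_features)),
    tf.keras.layers.Dense(horizon),  # next 24 hourly demand values
])
model.compile(optimizer="adam", loss="mse")

# Dummy data standing in for the DHC measurements described above
X = np.random.rand(256, window, n_features).astype("float32")
y = np.random.rand(256, horizon).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```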

Keywords: machine learning, LSTMs, district heating and cooling system, thermal demand

Procedia PDF Downloads 135
26672 Electric Arc Furnaces as a Source of Voltage Fluctuations in the Power System

Authors: Zbigniew Olczykowski

Abstract:

The paper presents the impact of electric arc furnace operation on the power grid. Arc furnace operation is modeled at different power conditions of the steelworks. The paper describes how to determine the increase in voltage fluctuations caused by arc furnaces working in parallel. Indicators characterizing power quality were recorded during several simultaneous measurement cycles at three points of the grid with different short-circuit power and different rated voltages; the measurements analyzed in this paper were conducted in the mains of a Polish steelworks. The power quality measurements comprised one-week measurement cycles in accordance with EN 50160. The data analysis includes the results obtained during simultaneous measurement at the three grid points, which makes it possible to determine the actual propagation of the disturbances generated by the device. Based on the model studies and the measurements of power quality indices, we establish the effect of a specific arc furnace on the mains. The minimum short-circuit power of the network, necessary to limit the voltage fluctuations generated by arc furnaces, is also estimated.
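
For context, the long-term flicker indicator referenced by EN 50160 is aggregated from twelve consecutive short-term values; a minimal computation of Plt from measured Pst values (synthetic numbers here, not the paper's measurements) is:

```python
# Long-term flicker severity Plt from twelve 10-minute Pst values
# (cubic aggregation per IEC 61000-4-15 / EN 50160); values are synthetic.
import numpy as np

pst = np.array([0.8, 0.9, 1.1, 1.3, 1.0, 0.7, 0.9, 1.2, 1.4, 1.1, 0.8, 0.9])
plt_value = np.cbrt(np.mean(pst ** 3))
print(f"Plt = {plt_value:.3f}")  # EN 50160 requires Plt <= 1 for 95% of a week
```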

Keywords: arc furnaces, long-term flicker, measurement and modeling of power quality, voltage fluctuations

Procedia PDF Downloads 282
26671 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we have analyzed maximum precipitation data recorded during a particular period of time at different stations of the Global Historical Climatology Network in the USA. One important point to mention is that some stations are shut down on certain days for one reason or another, so the maximum values are recorded by excluding those readings. It is assumed that the number of stations in operation follows a zero-truncated Poisson random variable, and the daily precipitation follows a lognormal random variable. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can also be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, we observe that it provides a better fit than some of the existing models.
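
As a hedged illustration of the model structure (parameter values are invented, and this is a simulation sketch rather than the authors' EM implementation), the observed maxima can be generated as the maximum of a zero-truncated Poisson number of lognormal readings:

```python
# Sketch: simulating the compound zero-truncated Poisson lognormal model,
# i.e. the maximum of N lognormal precipitation readings where N (the number
# of operating stations) is zero-truncated Poisson. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def sample_zt_poisson(lam, size, rng):
    """Zero-truncated Poisson via simple rejection of zeros."""
    out = np.empty(size, dtype=int)
    for i in range(size):
        n = 0
        while n == 0:
            n = rng.poisson(lam)
        out[i] = n
    return out

lam, mu, sigma, n_days = 4.0, 1.2, 0.6, 10_000
counts = sample_zt_poisson(lam, n_days, rng)
maxima = np.array([rng.lognormal(mu, sigma, k).max() for k in counts])
print(maxima.mean(), maxima.std())  # basis for checking an EM fit by simulation
```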

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 105
26670 Effect of Climate Variability on Children's Health Outcomes in Rural Uganda

Authors: Emily Injete Amondo, Alisher Mirzabaev, Emmanuel Rukundo

Abstract:

Children in rural farming households are often vulnerable to a multitude of risks, including health risks associated with climate change and variability. Cognizant of this, the study empirically traces the relationship between climate variability and nutritional health outcomes in rural children while identifying the cause-and-effect transmission mechanisms. We combined four waves of the rich Uganda National Panel Survey (UNPS), part of the World Bank Living Standards Measurement Studies (LSMS) for the period 2009-2014, with long-term, high-frequency rainfall and temperature datasets. Self-reported drought and flood shock variables were further used in separate regressions for triangulation and robustness checks. Panel fixed-effects regressions were applied in the empirical analysis, accounting for a variety of causal identification issues. The results showed significant negative effects of moderate and extreme droughts, extreme wet spells, and heatwaves on children's anthropometric measurements. On the contrary, moderate wet spells were positively linked with nutritional measures. Agricultural production and child diarrhea were the main transmission channels, with heatwaves, droughts, and high rainfall variability negatively affecting crop output. The probability of diarrhea was positively related to increases in temperature and dry spells. Results further revealed that children in households that engaged in ex-ante or anticipatory risk-reducing strategies such as savings had better health outcomes, as opposed to those engaged in ex-post coping such as involuntary change of diet. These results highlight the importance of adaptation in smoothing the harmful effects of climate variability on the health of rural households and children in Uganda.
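
As a schematic of the fixed-effects estimation strategy only (variable names and data below are hypothetical, and the study's actual specification is far richer), the within transformation can be coded directly:

```python
# Sketch of a household fixed-effects (within) estimator: demean outcome and
# regressors within each household, then run OLS. Variables are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "household": np.repeat(np.arange(200), 4),   # 4 survey waves per household
    "drought": rng.integers(0, 2, 800),          # shock exposure indicator
    "rain_var": rng.normal(0, 1, 800),           # rainfall variability
})
# Toy outcome standing in for a child anthropometric z-score
df["haz"] = -0.3 * df["drought"] - 0.1 * df["rain_var"] + rng.normal(0, 1, 800)

demeaned = df.groupby("household").transform(lambda s: s - s.mean())
X = demeaned[["drought", "rain_var"]].to_numpy()
y = demeaned["haz"].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["drought", "rain_var"], beta.round(3))))
```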

Keywords: extreme weather events, undernutrition, diarrhea, agricultural production, gridded weather data

Procedia PDF Downloads 97
26669 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported, and recommendations for future studies are proposed.
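
As an illustrative sketch only (this is not the authors' ontology; the namespace, classes, and properties below are hypothetical), NRM-style measurement rules can be encoded as machine-readable triples:

```python
# Illustrative sketch: encoding a couple of NRM-style measurement concepts
# as machine-readable RDF triples with rdflib.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

NRM = Namespace("http://example.org/nrm#")  # hypothetical namespace
g = Graph()
g.bind("nrm", NRM)

g.add((NRM.ElementUnitQuantity, RDF.type, RDFS.Class))
g.add((NRM.Wall, RDF.type, RDFS.Class))
g.add((NRM.Wall, RDFS.subClassOf, NRM.ElementUnitQuantity))
g.add((NRM.Wall, NRM.measuredIn, Literal("m2")))   # unit rule as a property
g.add((NRM.Wall, RDFS.comment,
       Literal("Walls measured by area per the applicable NRM rule.")))

print(g.serialize(format="turtle"))
```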

Keywords: BIM, construction projects, cost estimation, NRM, ontology

Procedia PDF Downloads 547
26668 Subpixel Corner Detection for Monocular Camera Linear Model Research

Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao

Abstract:

Camera calibration is a fundamental issue in high-precision non-contact measurement, and it is necessary to analyze the reliability and application range of the linear model that is often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by the subpixel corner detection method. Without considering the aberration of the optical system, feature extraction and linearity analysis of the line segments in the template are performed. The experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is obtained by fitting the 11 groups of data. The measurement results show that the relative error does not exceed 1%, and the repeated measurement error is on the order of 0.1 mm or less. Meanwhile, it is found that the model exhibits some measurement differences across regions and object distances. The experimental results show that this linear model is simple and practical, with good linearity within a certain object distance. These results provide a powerful basis for the establishment of the linear camera model and have potential value for actual engineering measurement.
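
A hedged sketch of this measurement chain, using a synthetic checkerboard in place of the authors' customized template, could look as follows:

```python
# Sketch of the measurement chain described above: detect corners on a
# synthetic checkerboard, refine them to sub-pixel accuracy with
# cv2.cornerSubPix, and check linearity of the pixel coordinates.
import cv2
import numpy as np

# Synthetic 8x8 checkerboard standing in for the customized template
board = (np.indices((8, 8)).sum(axis=0) % 2).astype(np.uint8) * 255
gray = np.kron(board, np.ones((25, 25), dtype=np.uint8))

corners = cv2.goodFeaturesToTrack(gray, maxCorners=49, qualityLevel=0.01,
                                  minDistance=10)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
refined = cv2.cornerSubPix(gray, np.float32(corners), (5, 5), (-1, -1), criteria)

# Linearity check along u: sorted corner u-coordinates should be evenly spaced
u = np.sort(np.unique(np.round(refined[:, 0, 0], 1)))
slope, intercept = np.polyfit(np.arange(u.size), u, 1)
print("max |residual| (px):",
      np.abs(u - (slope * np.arange(u.size) + intercept)).max())
```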

Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection

Procedia PDF Downloads 274
26667 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software

Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman

Abstract:

An engine test bed is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital Data Acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analyzed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time-consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to a vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens, but also records the data for further processing and archiving.

Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation

Procedia PDF Downloads 120
26666 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement

Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov

Abstract:

One of the basic issues of development management is performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company's development results. The model should take into account the cyclical nature of development and the high degree of uncertainty in dealing with numerous management tasks. Our hypotheses may be formulated as follows. Hypothesis 1: the cycle of a company's development may be studied from the standpoint of a project cycle; to do that, methods and tools of project analysis are to be used. Hypothesis 2: the problem of uncertainty when justifying managerial decisions within the framework of a company's development cycle can be solved through the use of the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy logic toolkit is applied to the case of a technology shift within an enterprise. It is proven that some restrictions in performance measurement that are inherent in conventional methods can be eliminated by implementing the fuzzy logic apparatus in performance measurement models.
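
As a minimal sketch of the fuzzy-logic apparatus (the breakpoints and the KPI value are illustrative, not the authors' calibration), triangular membership functions can turn a crisp performance value into degrees of membership:

```python
# Minimal sketch of fuzzy performance assessment: triangular membership
# functions turn a crisp KPI value into degrees of "low/medium/high"
# performance. Breakpoints are illustrative assumptions.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

kpi = 0.62  # e.g. a normalized development-result indicator in [0, 1]
memberships = {
    "low":    trimf(kpi, -0.01, 0.0, 0.5),
    "medium": trimf(kpi, 0.25, 0.5, 0.75),
    "high":   trimf(kpi, 0.5, 1.0, 1.01),
}
print(memberships)  # partial membership in both "medium" and "high"
```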

Keywords: logic, fuzzy sets, performance measurement, project analysis

Procedia PDF Downloads 373
26665 Quantifying Freeway Capacity Reductions by Rainfall Intensities Based on Stochastic Nature of Flow Breakdown

Authors: Hoyoung Lee, Dong-Kyu Kim, Seung-Young Kho, R. Eddie Wilson

Abstract:

This study quantifies the decrement in freeway capacity during rainfall. Traffic and rainfall data were gathered from highway agencies and the Wunderground weather service. Three inter-urban freeway sections and their nearest weather stations were selected as experimental sites. Capacity analysis found reductions in the maximum and mean pre-breakdown flow rates due to rainfall. The Kruskal-Wallis test also provided some evidence to suggest that the variance in the pre-breakdown flow rate is statistically insignificant. A potential application of this study lies in the operation of real-time traffic management schemes such as Variable Speed Limits (VSL), Hard Shoulder Running (HSR), and Ramp Metering Systems (RMS), where speed or flow limits could be set based on a number of factors, including rainfall events and their intensities.
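
As a hedged sketch of the statistical comparison described (the data below are synthetic, not the study's observations), the Kruskal-Wallis test on pre-breakdown flow rates grouped by rainfall intensity runs as follows:

```python
# Kruskal-Wallis test on pre-breakdown flow rates grouped by rainfall
# intensity. Flow-rate samples are synthetic stand-ins.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(3)
dry      = rng.normal(2000, 150, 40)   # veh/h/lane, hypothetical
light    = rng.normal(1900, 150, 40)
moderate = rng.normal(1800, 150, 40)
heavy    = rng.normal(1650, 150, 40)

stat, p = kruskal(dry, light, moderate, heavy)
print(f"H = {stat:.2f}, p = {p:.4f}")  # small p: distributions differ by intensity
```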

Keywords: capacity randomness, flow breakdown, freeway capacity, rainfall

Procedia PDF Downloads 377
26664 Towards an Effective Approach for Modelling Near-Surface Air Temperature Combining Weather and Satellite Data

Authors: Nicola Colaninno, Eugenio Morello

Abstract:

The urban environment affects local-to-global climate and, in turn, suffers from global warming phenomena, with worrying impacts on human well-being, health, and social and economic activities. The physical and morphological features of the built-up space affect urban air temperature locally, causing the urban environment to be warmer than the surrounding rural areas. This occurrence, typically known as the Urban Heat Island (UHI), is normally assessed by means of air temperatures from fixed weather stations and/or traverse observations, or based on remotely sensed Land Surface Temperatures (LST). The information provided by ground weather stations is key for assessing local air temperature. However, the spatial coverage is normally limited due to the low density and uneven distribution of the stations. Although different interpolation techniques such as Inverse Distance Weighting (IDW), Ordinary Kriging (OK), or Multiple Linear Regression (MLR) are used to estimate air temperature from observed points, such an approach may not effectively reflect the real climatic conditions of an interpolated point. Quantifying local UHI for extensive areas based on weather station observations alone is therefore not practicable. Alternatively, the use of thermal remote sensing has been widely investigated based on LST; data from Landsat, ASTER, or MODIS have been extensively used. Indeed, LST has an indirect but significant influence on air temperatures. However, high-resolution near-surface air temperature (NSAT) is currently difficult to retrieve. Here we experiment with Geographically Weighted Regression (GWR) as an effective approach to enable NSAT estimation by accounting for the spatial non-stationarity of the phenomenon. The model combines on-site measurements of air temperature from fixed weather stations with satellite-derived LST. The approach is structured in two main steps. First, a GWR model is set up to estimate NSAT at low resolution by combining air temperature from discrete observations retrieved from weather stations (dependent variable) and LST from satellite observations (predictor). At this step, MODIS data from the Terra satellite, at 1 km spatial resolution, are employed; two time periods are considered according to the satellite revisit period, i.e., 10:30 am and 9:30 pm. Afterward, the results are downscaled to 30 m spatial resolution by setting up a GWR model between the previously retrieved near-surface air temperature (dependent variable), the multispectral information provided by the Landsat mission, in particular the albedo, and the Digital Elevation Model (DEM) from the Shuttle Radar Topography Mission (SRTM), both at 30 m; albedo and DEM are now the predictors. The area under investigation is the Metropolitan City of Milan, which covers an area of approximately 1,575 km² and encompasses a population of over 3 million inhabitants. Both models, low-resolution (1 km) and high-resolution (30 m), have been validated by cross-validation using indicators such as R², Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE). All the employed indicators give evidence of highly efficient models. In addition, an alternative network of weather stations, available for the City of Milan only, has been employed to test the accuracy of the predicted temperatures, giving an RMSE of 0.6 and 0.7 for daytime and night-time, respectively.
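
As a schematic illustration of the core GWR step (coordinates, bandwidth, and data are synthetic assumptions, and production work would typically use a dedicated GWR package with bandwidth selection), each target location gets its own kernel-weighted least-squares fit:

```python
# Schematic GWR: at each target location, weight observations by a Gaussian
# kernel of distance and solve a local weighted least-squares fit of air
# temperature on LST. Coordinates, bandwidth and data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 120
coords = rng.uniform(0, 50, (n, 2))                 # station locations (km)
lst = rng.uniform(15, 40, n)                        # MODIS-like LST predictor
t_air = 0.7 * lst + 2.0 + rng.normal(0, 1, n)       # observed air temperature

def gwr_local_fit(x0, bandwidth=10.0):
    d = np.linalg.norm(coords - x0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)          # Gaussian kernel weights
    X = np.column_stack([np.ones(n), lst])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], t_air * sw, rcond=None)
    return beta  # local intercept and slope at x0

print(gwr_local_fit(np.array([25.0, 25.0])))
```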

Keywords: urban climate, urban heat island, geographically weighted regression, remote sensing

Procedia PDF Downloads 190
26663 A Systematic Review on Measuring the Physical Activity Level and Pattern in Persons with Chronic Fatigue Syndrome

Authors: Kuni Vergauwen, Ivan P. J. Huijnen, Astrid Depuydt, Jasmine Van Regenmortel, Mira Meeus

Abstract:

A lower activity level and an imbalanced activity pattern are frequently observed in persons with chronic fatigue syndrome (CFS) / myalgic encephalomyelitis (ME) due to debilitating fatigue and post-exertional malaise (PEM). Identification of measurement instruments to evaluate the activity level and pattern is therefore important. The objective is to identify measurement instruments suited to evaluate the activity level and/or pattern in patients with CFS/ME and to review their psychometric properties. A systematic literature search was performed in the electronic databases PubMed and Web of Science up to 12 October 2016. Articles including relevant measurement instruments were identified and included for further analysis. The psychometric properties of relevant measurement instruments were extracted from the included articles and rated based on the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. The review was performed and reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. A total of 49 articles and 15 unique measurement instruments were found, but only three instruments have been evaluated in patients with CFS/ME: the Chronic Fatigue Syndrome-Activity Questionnaire (CFS-AQ), the Activity Pattern Interview (API) and the International Physical Activity Questionnaire-Short Form (IPAQ-SF), all three of which are self-report instruments measuring the physical activity level. The IPAQ-SF, CFS-AQ and API are equally capable of evaluating the physical activity level, but none of the three is optimal to use. No studies on the psychometric properties of activity monitors in patients with CFS/ME were found, although such monitors are often used as the gold standard to measure the physical activity pattern. More research is needed to evaluate the psychometric properties of existing instruments, including activity monitors.

Keywords: chronic fatigue syndrome, data collection, physical activity, psychometrics

Procedia PDF Downloads 223
26662 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model

Authors: Catherine Maware, Olufemi Adetunji

Abstract:

The paper is aimed at developing a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance. It will also harmonize measurement models of Lean performance with the house of Lean, which seems to have become the industry standard. The sheer number of measurement models for impact assessment of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify Lean performance models. Pareto analysis is used to select the Lean constructs for the development of the model. The model is further formalized through the use of Structural Equation Modeling (SEM) to define the underlying latent structure of a Lean system. The resulting impact assessment measurement model can be used to measure Lean performance and can be adopted by different industries.

Keywords: impact measurement model, lean bundles, lean manufacturing, organizational performance

Procedia PDF Downloads 478
26661 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms

Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li

Abstract:

High-precision measurement of a target's position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely the Distance Area Number-New Minimize Projection Error (DAN-NMPE) method. Our algorithm contains two parts: DAN, a picture-sequence algorithm, and NMPE, a projection-error optimization algorithm; together they greatly improve the measurement accuracy of the target's position and size. Comprehensive experiments validate the effectiveness of our proposed method on a self-made traffic sign dataset. The results show that, with a laser point cloud as the ground truth, the size and position errors of the traffic signs measured by this method are ±5% and 0.48 ± 0.3 m, respectively. In addition, we compared it with the current mainstream method, which uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements over existing state-of-the-art methods, improving the measurement accuracy of size and position by 50% and 15.8%, respectively.
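
The abstract does not detail NMPE, so as a hedged sketch of the generic objective such projection-error methods minimize, the snippet below computes a reprojection error for an assumed pinhole camera (the intrinsics, points, and identity rotation are all illustrative):

```python
# Sketch of a reprojection-error objective: project 3D points through a
# pinhole camera and measure pixel residuals. Intrinsics are assumed.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],     # hypothetical intrinsic matrix
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def reprojection_error(points_3d, points_2d, t):
    """Mean pixel error for camera translation t (rotation omitted)."""
    cam = points_3d + t.reshape(3)      # world -> camera (identity rotation)
    proj = (K @ cam.T).T
    uv = proj[:, :2] / proj[:, 2:3]     # perspective division
    return np.linalg.norm(uv - points_2d, axis=1).mean()

pts3d = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 10.0], [0.0, 1.0, 12.0]])
pts2d = np.array([[320.0, 240.0], [400.0, 240.0], [320.0, 306.7]])
print(reprojection_error(pts3d, pts2d, np.zeros(3)))  # near zero for this pose
```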

Keywords: monocular camera, GPS, positioning, measurement

Procedia PDF Downloads 136
26660 Wave Pressure Metering with the Specific Instrument and Measure Description Determined by the Shape and Surface of the Instrument including the Number of Sensors and Angle between Them

Authors: Branimir Jurun, Elza Jurun

Abstract:

The focus of this paper is the description and functioning of an instrument for wave pressure metering. An essential contribution of the paper is the proposal of a metering unit for direct wave pressure measurement, determined by the shape and surface of the instrument, including the number of sensors and the angle between them. Namely, the instruments applied so far determine the wave pressure on a particular area indirectly, by means of wave height, length, direction, period and other components. The proposed instrument allows direct measurement, i.e., measurement without additional calculation, of the wave pressure expressed in a standardized unit of measure. To that end, the instrument has a standardized form, surface, number of sensors and angle between them. In addition, it is built so that it follows the wave and always remains on the water surface. The quality of the database produced by the instrument is ensured by using an Arduino chip, programmed to receive two readings from each of the sensors every second. From these data, a unique representative value is estimated in a predefined manner. By this procedure, all relevant wave pressure measurement results are directly and immediately registered. The final goal of establishing such a rich database is a comprehensive statistical analysis, ranging from multi-criteria analysis through modeling and parameter testing to hypothesis testing, relating to the widest variety of man-made activities such as beach nourishment, security cages for aquaculture, and bridge construction.

Keywords: instrument, metering, water, waves

Procedia PDF Downloads 258
26659 Development of a PM2.5 Forecasting System in Seoul, South Korea Using Chemical Transport Modeling and ConvLSTM-DNN

Authors: Ji-Seok Koo, Hee‑Yong Kwon, Hui-Young Yun, Kyung-Hui Wang, Youn-Seo Koo

Abstract:

This paper presents a forecasting system for PM2.5 levels in Seoul, South Korea, leveraging a combination of chemical transport modeling and ConvLSTM-DNN machine learning technology. Exposure to PM2.5 has known detrimental impacts on public health, making its prediction crucial for establishing preventive measures. Existing forecasting models, like the Community Multiscale Air Quality (CMAQ) and Weather Research and Forecasting (WRF) models, are hindered by their reliance on uncertain input data, such as anthropogenic emissions and meteorological patterns, as well as certain intrinsic model limitations. The system we have developed specifically addresses these issues by integrating machine learning and using carefully selected input features that account for local and distant sources of PM2.5. In South Korea, the PM2.5 concentration is greatly influenced by both local emissions and long-range transport from China, and our model effectively captures these spatial and temporal dynamics. Our PM2.5 prediction system combines the strengths of advanced hybrid machine learning algorithms, ConvLSTM and DNN, to improve upon the limitations of the traditional CMAQ model. Data used in the system include forecast information from the CMAQ and WRF models, along with actual PM2.5 concentrations and weather variables from monitoring stations in China and South Korea. The system was implemented specifically for Seoul's PM2.5 forecasting.
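
As a rough sketch of a ConvLSTM-DNN hybrid of the kind described (grid sizes, layer widths, and the single-site output are assumptions, not the authors' configuration):

```python
# Rough sketch of a ConvLSTM-DNN hybrid: a ConvLSTM encodes a sequence of
# gridded pollution/weather fields, and dense layers map the encoding to a
# station-level PM2.5 value. Shapes are assumptions.
import numpy as np
import tensorflow as tf

steps, rows, cols, channels = 12, 16, 16, 4   # 12 past hours of 16x16 grids
model = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(32, (3, 3), padding="same",
                               input_shape=(steps, rows, cols, channels)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),  # the "DNN" head
    tf.keras.layers.Dense(1),                      # PM2.5 at a target site
])
model.compile(optimizer="adam", loss="mae")

X = np.random.rand(64, steps, rows, cols, channels).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(X, y, epochs=1, verbose=0)
```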

Keywords: PM2.5 forecast, machine learning, convLSTM, DNN

Procedia PDF Downloads 52
26658 The Effects of Cooling during Baseball Games on Perceived Exertion and Core Temperature

Authors: Chih-Yang Liao

Abstract:

Baseball is usually played outdoors in the warmest months of the year; therefore, baseball players are susceptible to the influence of a hot environment. It has been shown that hitting performance in Major League Baseball increases in games played in warm weather compared to cold weather. Intermittent cooling during sporting events can reduce the risk of hyperthermia and increase endurance performance. However, the effects of cooling during baseball games played in a hot environment are unclear. This study adopted a cross-over design. Ten Division I collegiate male baseball players in Taiwan volunteered to participate. Each player played two simulated baseball games, with one day in between. Five of the players received intermittent cooling during the first simulated game, while the other five received intermittent cooling during the second simulated game. The participants' neck and forehead regions were covered for 6 min with towels soaked in icy salt water, 3 to 4 times during each game. The participants received the cooling treatment in the dugout when they were not on the field for defense or hitting. During the two simulated games, the temperature was 31.1-34.1°C and the humidity was 58.2-61.8%, with no difference between the games. Ratings of perceived exertion, thermal sensation, and tympanic and forehead skin temperatures were recorded immediately after each defensive half-inning and after each cooling treatment. Ratings of perceived exertion were measured using the Borg 10-point scale. Thermal sensation was measured with a 6-point scale. Tympanic and skin temperatures were measured with infrared thermometers. The data were analyzed with a two-way analysis of variance with repeated measures. The results showed that intermittent cooling significantly reduced ratings of perceived exertion and thermal sensation. Forehead skin temperature was also significantly decreased after the cooling treatments. However, tympanic temperature was not significantly different between the two trials. In conclusion, intermittent cooling of the neck and forehead regions was effective in alleviating perceived exertion and heat sensation; however, this cooling intervention did not affect core temperature. Whether intermittent cooling has any impact on hitting or pitching performance in baseball players warrants further investigation.

Keywords: baseball, cooling, ratings of perceived exertion, thermal sensation

Procedia PDF Downloads 140
26657 Improved Soil and Snow Treatment with the Rapid Update Cycle Land-Surface Model for Regional and Global Weather Predictions

Authors: Tatiana G. Smirnova, Stan G. Benjamin

Abstract:

The Rapid Update Cycle (RUC) land surface model (LSM) has been the land-surface component of several generations of operational weather prediction models at the National Centers for Environmental Prediction (NCEP) of the National Oceanic and Atmospheric Administration (NOAA). It was designed for short-range weather prediction with an emphasis on severe weather, and was originally kept intentionally simple to avoid uncertainties from poorly known parameters. Nevertheless, the RUC LSM, when coupled with an hourly-assimilating atmospheric model, can produce a realistic evolution of time-varying soil moisture and temperature, as well as the evolution of snow cover on the ground surface. This result is possible only if the soil/vegetation/snow component of the coupled weather prediction model has sufficient skill to avoid long-term drift. The RUC LSM was first implemented in the operational NCEP Rapid Update Cycle (RUC) weather model in 1998, and later in the Weather Research and Forecasting Model (WRF)-based Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR). Being available to the international WRF community, it has been implemented in operational weather models in Austria, New Zealand, and Switzerland. Based on feedback from US weather service offices and the international WRF community, as well as our own validation, the RUC LSM has matured over the years. A sea-ice module was also added to the RUC LSM for surface predictions over Arctic sea ice. Other modifications include refinements to the snow model and a more accurate specification of albedo, roughness length, and other surface properties. At present, the RUC LSM is being tested in the regional application of the Unified Forecast System (UFS). The next-generation UFS-based regional Rapid Refresh FV3 Standalone (RRFS) model will replace the operational RAP and HRRR at NCEP. Over time, the RUC LSM has participated in several international model intercomparison projects to verify its skill using observed atmospheric forcing. ESM-SnowMIP was the most recent of these experiments, focused on the verification of snow models for open and forested regions. The simulations were performed for ten sites located in different climatic zones of the world, forced with observed atmospheric conditions. While most of the 26 participating models have more sophisticated snow parameterizations than RUC, the RUC LSM received a high ranking in simulations of both snow water equivalent and surface temperature. However, the ESM-SnowMIP experiment also revealed some issues in the RUC snow model, which are addressed in this paper. One of them is the treatment of grid cells partially covered with snow. The RUC snow module computes the energy and moisture budgets of snow-covered and snow-free areas separately, aggregating the solutions at the end of each time step. Such treatment elevates the importance of the snow cover fraction computed in the model. Improvements to the original simplistic threshold-based approach have been implemented and tested both offline and in the coupled weather model. The changes to the snow cover fraction and other modifications to the RUC soil and snow parameterizations are described in detail in this paper.
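
Purely as an illustration of the aggregation idea, not the RUC code itself (the ramp function and constants below are invented), a snow-cover-fraction-weighted blend of the two sub-grid solutions looks like this:

```python
# Illustrative only (not the RUC implementation): aggregating surface fluxes
# over a grid cell as a snow-cover-fraction-weighted blend of snow-covered
# and snow-free solutions, with a simple SWE-based fraction estimate.
import numpy as np

def snow_cover_fraction(swe_mm, swe_full=30.0):
    """Assumed smooth ramp: fraction grows with snow water equivalent."""
    return np.clip(swe_mm / swe_full, 0.0, 1.0)

def aggregate_flux(flux_snow, flux_bare, swe_mm):
    f = snow_cover_fraction(swe_mm)
    return f * flux_snow + (1.0 - f) * flux_bare

# e.g. sensible heat flux (W/m2) from the two sub-grid solutions
print(aggregate_flux(flux_snow=-15.0, flux_bare=40.0, swe_mm=12.0))
```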

Keywords: land-surface models, weather prediction, hydrology, boundary-layer processes

Procedia PDF Downloads 82
26656 Performance Complexity Measurement of Tightening Equipment Based on Kolmogorov Entropy

Authors: Guoliang Fan, Aiping Li, Xuemei Liu, Liyun Xu

Abstract:

The performance of tightening equipment declines over the course of its working process in a manufacturing system. The main manifestations are increasing randomness and an increasing degree of discretization of the tightening performance. To evaluate the degradation tendency of the tightening performance accurately, a complexity measurement approach based on Kolmogorov entropy is presented. First, the states of the performance index are partitioned to calibrate the degree of discretization. Then, a complexity measurement model based on Kolmogorov entropy is built; the model describes the performance degradation tendency of the tightening equipment quantitatively. Finally, a case study is used to verify the efficiency and validity of the approach. The results show that the presented complexity measurement can effectively evaluate the degradation tendency of tightening equipment and can provide a theoretical basis for preventive maintenance and life prediction of equipment.
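
As a hedged sketch of an entropy-rate estimate in the spirit of Kolmogorov entropy (the signal, state partition, and block length are illustrative, not the authors' model), one can symbolize the performance signal and difference the block entropies:

```python
# Sketch of an entropy-rate estimate: symbolize the performance signal into
# discrete states, then take the difference of block entropies H(m+1) - H(m).
# Parameters and the stand-in signal are illustrative.
import numpy as np
from collections import Counter

def block_entropy(symbols, m):
    blocks = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(5)
signal = rng.normal(0, 1, 5000)                  # stand-in for torque readings
edges = np.quantile(signal, [0.25, 0.5, 0.75])   # 4 equiprobable states
symbols = np.digitize(signal, edges)

m = 2
k_estimate = block_entropy(symbols, m + 1) - block_entropy(symbols, m)
print(f"entropy rate ~ {k_estimate:.3f} nats/sample")  # rises as randomness grows
```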

Keywords: complexity measurement, Kolmogorov entropy, manufacturing system, performance evaluation, tightening equipment

Procedia PDF Downloads 258
26655 A Method for Measurement and Evaluation of Drape of Textiles

Authors: L. Fridrichova, R. Knížek, V. Bajzík

Abstract:

Drape is one of the important visual characteristics of fabric. This paper introduces an innovative method for measuring and evaluating the drape shape of fabric. The measuring principle is based on the possibility of repeated vertical straining of the fabric, which more accurately simulates the real behavior of the fabric in the process of draping. The method is fully automated, so the sample can be measured using any number of cycles over any time horizon. Using the present method of measurement, we are able to describe the viscoelastic behavior of the fabric.

Keywords: drape, drape shape, automated drapemeter, fabric

Procedia PDF Downloads 650
26654 Causal Inference Engine between Continuous Emission Monitoring System Combined with Air Pollution Forecast Modeling

Authors: Yu-Wen Chen, Szu-Wei Huang, Chung-Hsiang Mu, Kelvin Cheng

Abstract:

This paper develops a data-driven model to address the causality between Continuous Emission Monitoring System data (CEMS, operated under the Environmental Protection Administration, Taiwan) from industrial factories and the air quality of the surrounding environment. Compared to the heavy computational burden of traditional numerical models for regional weather and air pollution simulation, the lightweight proposed model can produce hourly forecasts from current observations of weather, air pollution, and factory emissions. The observation data include wind speed, wind direction, relative humidity, temperature, and other variables. The observations are collected in real time from the open APIs of Civil IoT Taiwan, which are sourced from 439 weather stations, 10,193 qualitative air stations, 77 national quantitative stations, and 140 CEMS quantitative industrial factories. This study implements a causal inference engine that produces a 12-hour air pollution forecast related to local industrial factories. The pollution forecasts are produced hourly with a grid resolution of 1 km × 1 km on the IIoTC (Industrial Internet of Things Cloud) and saved in netCDF4 format. The elaborated procedures to generate forecasts comprise data recalibration, outlier elimination, Kriging interpolation, and particle-tracking and random-walk techniques for the mechanisms of diffusion and advection. The solution of these equations reveals the causality between factory emissions and the associated air pollution. Further, with the aid of installed real-time flue emission sensors (Total Suspension Particulates, TSP) and the aforementioned forecasted air pollution map, this study also discloses the conversion mechanism between TSP and PM2.5/PM10 for different regional and industrial characteristics, according to long-term data observation and calibration. These time-series qualitative and quantitative data together realize a practicable cloud-based causal inference engine for factory management control. Once the forecasted air quality for a region is marked as harmful, the correlated factories are notified and asked to curtail their operation and reduce emissions in advance.
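
As a hedged sketch of the random-walk advection-diffusion step behind such a forecast grid (the wind, eddy diffusivity, time step, and grid are all assumed values):

```python
# Sketch of the advection-diffusion particle step: each emitted particle is
# advected by the wind and takes a random-walk diffusion step. Wind, eddy
# diffusivity and time step are illustrative.
import numpy as np

rng = np.random.default_rng(6)
n_particles, dt, K = 10_000, 60.0, 50.0     # s, m^2/s (assumed diffusivity)
u, v = 2.0, -1.0                            # wind components (m/s), assumed

pos = np.zeros((n_particles, 2))            # all particles start at the stack
for _ in range(60):                         # one hour of 60 s steps
    pos[:, 0] += u * dt + np.sqrt(2 * K * dt) * rng.standard_normal(n_particles)
    pos[:, 1] += v * dt + np.sqrt(2 * K * dt) * rng.standard_normal(n_particles)

# Bin particles onto a 1 km x 1 km grid to obtain a relative concentration map
hist, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=20,
                            range=[[-10_000, 20_000], [-15_000, 15_000]])
print(hist.sum(), hist.max())
```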

Keywords: continuous emission monitoring system, total suspension particulates, causal inference, air pollution forecast, IoT

Procedia PDF Downloads 79
26653 Computation of Flood and Drought Years over the North-West Himalayan Region Using Indian Meteorological Department Rainfall Data

Authors: Sudip Kumar Kundu, Charu Singh

Abstract:

The climatic conditions over the Indian region are highly dependent on the monsoon, and India receives the maximum amount of rainfall during the southwest monsoon. The Indian economy is highly dependent on agriculture, and flood and drought years influence the entire cultivation system as well as the economy of the country, since Indian agriculture is still highly dependent on monsoon rainfall. The present study investigates the flood and drought years for the north-west Himalayan region from 1951 to 2014 using area-averaged India Meteorological Department (IMD) rainfall data. For this investigation, the Normalized Index (NI) is utilized to determine whether a particular year is a drought or a flood year. The data were extracted for the north-west Himalayan (NWH) region states, namely Uttarakhand (UK), Himachal Pradesh (HP), and Jammu and Kashmir (J&K), to obtain the rainy season average rainfall for each year, the climatological mean, and the standard deviation. The results are then plotted, showing that some years are drought years, some are flood years, and the rest are neutral. The flood and drought years can also be related to the large-scale phenomena El Niño and La Niña.
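
A minimal sketch of the NI classification follows (the rainfall totals are synthetic, and the ±1 threshold is an assumption, as the abstract does not state the cutoff used):

```python
# Normalized Index classification: standardize each year's monsoon rainfall
# by the climatological mean and standard deviation, then flag years.
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1951, 2015)
rain = rng.normal(900, 120, years.size)    # seasonal totals (mm), synthetic
ni = (rain - rain.mean()) / rain.std()

labels = np.where(ni > 1.0, "flood", np.where(ni < -1.0, "drought", "neutral"))
print({t: int((labels == t).sum()) for t in ("flood", "drought", "neutral")})
```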

Keywords: IMD, rainfall, normalized index, flood, drought, NWH

Procedia PDF Downloads 285
26652 A New Intelligent, Dynamic and Real-Time Management System of Sewerage

Authors: R. Tlili Yaakoubi, H.Nakouri, O. Blanpain, S. Lallahem

Abstract:

The current tools for real-time management of sewer systems are based on two software components: weather forecasting software and hydraulic simulation software. The use of the former is an important source of imprecision and uncertainty; the use of the latter imposes large decision time steps because of its computation-time requirements. As a consequence, the results obtained are generally different from those expected. The major idea of this project is to change the basic paradigm by approaching the problem from the 'automatic control' side rather than the 'hydrology' side. The objective is to make possible a large number of simulations in very short times (a few seconds), allowing weather forecasts to be replaced by directly using real-time measured pluviometric data. The aim is to reach a system where decision-making is based on reliable data and where the correction of errors is permanent. A first model of control laws was built and tested with rainfalls of different return periods; the gains obtained in discharged volume vary from 19 to 100%. A new algorithm was then developed to optimize calculation time and thus overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The gains obtained are 40% in the total volume discharged to the natural environment and 65% in the number of discharges.

Keywords: automation, optimization, paradigm, RTC

Procedia PDF Downloads 293
26651 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now being used widely to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover the most basic binary IRT model, known as the 1-parameter logistic (1-PL) model, dating back over 50 years, up to the most recent and complex 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically, and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. simulated data of N=500,000 subjects who responded to four dichotomous items and 2. a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences of alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate simulated data and analyses will be available upon request to allow for replication of results.
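
For reference, the binary models discussed form a nested family; the 4-parameter logistic response function, of which the simpler models are special cases, can be written as:

```latex
% 4-PL item response function: a_i = discrimination, b_i = difficulty,
% c_i = lower asymptote (pseudo-guessing), d_i = upper asymptote.
% The 3-PL fixes d_i = 1; the 2-PL additionally fixes c_i = 0; the 1-PL
% further constrains a_i to a common value across items.
P(X_{ij} = 1 \mid \theta_j) = c_i + \frac{d_i - c_i}{1 + e^{-a_i(\theta_j - b_i)}}
```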

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 346
26650 Lessons from Nature: Defensive Designs for the Built Environment

Authors: Rebecca A. Deek

Abstract:

There is evidence that erratic and extreme weather is becoming a common occurrence, and there are even predictions that it will become more frequent and more severe. The severity of earthquakes also appears to be intensifying. Some observers believe that human conduct has contributed to such change; others attribute it to environmental and geological cycles. However, as some physicists, environmental scientists, politicians, and others continue to debate the connection between weather events, seismic activities, and climate change, other scientists, engineers, and urban planners are exploring how our habitat can become more responsive and resilient to such phenomena. A number of recent instances of nature's destructive events provide a basis for the development of defensive measures.

Keywords: biomimicry, natural disasters, protection of human lives, resilient infrastructures

Procedia PDF Downloads 499
26649 Estimates of Freshwater Content from ICESat-2 Derived Dynamic Ocean Topography

Authors: Adan Valdez, Shawn Gallaher, James Morison, Jordan Aragon

Abstract:

Global climate change has raised atmospheric temperatures, contributing to rising sea levels, decreasing sea ice, and increased freshening of high-latitude oceans. This freshening has increased stratification, inhibiting local mixing and nutrient transport and modifying regional circulations in polar oceans. In recent years, the Western Arctic has seen an increase in freshwater volume at an average rate of 397 ± 116 km³/year. The majority of the freshwater volume resides in the Beaufort Gyre surface lens, driven by anticyclonic wind forcing, sea ice melt, and Arctic river runoff. The total climatological freshwater content is typically defined as water fresher than a salinity of 34.8. The near-isothermal nature of Arctic seawater and non-linearities in the equation of state for near-freezing waters result in a salinity-driven pycnocline, as opposed to the temperature-driven density structure seen at lower latitudes. In this study, we investigate the relationship between freshwater content and remotely sensed dynamic ocean topography (DOT). In-situ measurements of freshwater content are useful for estimating the freshening rate of the Beaufort Gyre; however, their collection is costly and time-consuming. Dynamic ocean topography derived from NASA's Advanced Topographic Laser Altimeter System (ATLAS) and freshwater content derived from Airborne Expendable CTDs (AXCTD) are used to develop a linear regression model. The in-situ data for the regression model are collected along the 150° West meridian, which typically defines the centerline of the Beaufort Gyre. Two freshwater content models are determined by integrating the freshwater volume between the surface and the isopycnals corresponding to reference salinities of 28.7 and 34.8; these salinities correspond to the winter pycnocline and the total climatological freshwater content, respectively. Using each model, we determine the strength of the linear relationship between freshwater content and satellite-derived DOT. The results of this modeling study could provide a future predictive capability for freshwater volume changes in the Beaufort-Chukchi Sea using non-in-situ methods. Successful employment of ICESat-2's DOT approximation of freshwater content could potentially reduce reliance on field deployment platforms to characterize physical ocean properties.
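
As a hedged sketch of the freshwater content computation and the regression described (the salinity profiles and DOT values below are synthetic stand-ins for the AXCTD and ICESat-2 data):

```python
# Sketch: integrate (S_ref - S)/S_ref over depth to get freshwater content,
# then regress FWC on DOT. All profiles and DOT values are synthetic.
import numpy as np

def freshwater_content(depth_m, salinity, s_ref=34.8):
    frac = np.clip((s_ref - salinity) / s_ref, 0.0, None)
    return np.trapz(frac, depth_m)   # meters of freshwater in the column

rng = np.random.default_rng(8)
z = np.linspace(0, 400, 200)
# Fresher surface lens relaxing toward salty deep water, varied per station
profiles = [30.5 + 4.5 * (1 - np.exp(-z / (100 + 40 * rng.random())))
            for _ in range(25)]
fwc = np.array([freshwater_content(z, s) for s in profiles])
dot = 0.04 * fwc + rng.normal(0, 0.05, fwc.size)   # toy DOT (m) vs FWC relation

slope, intercept = np.polyfit(dot, fwc, 1)         # the linear model in the study
print(f"FWC ~ {slope:.1f} * DOT + {intercept:.1f}")
```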

Keywords: ICESat-2, dynamic ocean topography, freshwater content, beaufort gyre

Procedia PDF Downloads 74
26648 A Wireless Sensor System for Continuous Monitoring of Particulate Air Pollution

Authors: A. Yawootti, P. Intra, P. Sardyoung, P. Phoosomma, R. Puttipattanasak, S. Leeragreephol, N. Tippayawong

Abstract:

The aim of this work is to design, develop and test a particulate air pollution sensor system for continuous monitoring of outdoor and indoor particulate air pollution at a lower cost than existing instruments. In this study, a technique measuring the electrostatic charge of particles via a high-efficiency particulate air filter was employed. The developed detector consists of a PM10 impactor, a particle charger, a Faraday cup electrometer, a flow meter and controller, a vacuum pump, a DC high-voltage power supply, and a data processing and control unit. The developed detector is capable of measuring particulate mass concentrations ranging from 0 to 500 µg/m³, corresponding to particulate number concentrations ranging from 10⁶ to 10¹² particles/m³, with a measurement time of less than 1 s. The measurement data are sent to the internet through a GSM connection to a public cellular network. In this development, the apparatus is powered by a 12 V, 7 Ah internal battery for continuous measurement of about 20 hours. Finally, the developed apparatus was found to be in close agreement with a standard imported instrument, and to be portable and beneficial for air pollution and particulate matter measurements.

Keywords: particulate, air pollution, wireless communication, sensor

Procedia PDF Downloads 360
26647 Measurement of Asphalt Pavement Temperature to Find out the Proper Asphalt Binder Performance Grade for the Asphalt Mixtures in the Southern Desert of Libya

Authors: Khlifa El Atrash, Gabriel Assaf

Abstract:

Most developing countries use volumetric analysis in designing asphalt mixtures, an approach that can be upgraded for hot arid weather. To be effective, however, mixture design should include several important aspects: materials, environment, and method of construction. The overall intent of the work reported in this study is to test different asphalt mixtures while taking into consideration the environment, the type and source of materials, the tools and equipment, and the construction method. Several tests were conducted on samples carefully prepared for the traffic loads and temperatures expected in a dry hot climate. Several asphalt concrete mixtures were designed using two different binders. These mixtures were analyzed with two types of tests, the complex modulus test and the rutting test, to evaluate the hot-mix asphalt properties under the temperatures and traffic loads representative of Libya. These factors play an important role in improving pavement performance in a hot climate, based on the properties of the asphalt mixture, the climate, and the traffic load. This research summarizes recommendations for asphalt mixtures used in hot, dry areas. Such mixtures should use an asphalt binder that is less affected by pavement temperature change and traffic load. The properties of the mixture, such as durability, deformation, air voids, and performance, largely depend on the type of materials, the environment, and the mixing method; these properties, in turn, affect pavement performance. Therefore, this study aims to develop a method for designing an asphalt mixture that takes into account field loading, various stresses, and temperature spectra.

Keywords: volumetric analysis, pavement performances, hot climate, asphalt mixture, traffic load

Procedia PDF Downloads 302
26646 Conception of a Regulated, Dynamic and Intelligent Sewerage in Ostrevent

Authors: Rabaa Tlili Yaakoubi, Hind Nakouri, Olivier Blanpain

Abstract:

The current tools for real-time management of sewer systems are based on two software components: weather forecasting software and hydraulic simulation software. The use of the former is an important source of imprecision and uncertainty; the use of the latter imposes large decision time steps because of its computation-time requirements. As a consequence, the results obtained are generally different from those expected. The major idea of the CARDIO project is to change the basic paradigm by approaching the problem from the 'automatic control' side rather than the 'hydrology' side. The objective is to make possible a large number of simulations in very short times (a few seconds), allowing weather forecasts to be replaced by directly using real-time measured pluviometric data. The aim is to reach a system where decision-making is based on reliable data and where the correction of errors is permanent. A first model of control laws was built and tested with rainfalls of different return periods; the gains obtained in discharged volume vary from 40 to 100%. A new algorithm was then developed to optimize calculation time and thus overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The gains obtained are 60% in the total volume discharged to the natural environment and 80% in the number of discharges.

Keywords: RTC, paradigm, optimization, automation

Procedia PDF Downloads 279