Search results for: accuracy improvement
374 Improvement of Oxidative Stability of Edible Oil by Microencapsulation Using Plant Proteins
Authors: L. Le Priol, A. Nesterenko, K. El Kirat, K. Saleh
Abstract:
Introduction and objectives: Polyunsaturated fatty acids (PUFAs) omega-3 and omega-6 are widely recognized as being beneficial to health and normal growth. Unfortunately, due to their highly unsaturated nature, these molecules are sensitive to oxidation and thermal degradation, leading to the production of toxic compounds and unpleasant flavors and smells. Hence, it is necessary to find a suitable way to protect them. Microencapsulation by spray-drying is a low-cost encapsulation technology and the one most commonly used in the food industry. Many compounds can be used as wall materials, but there has been a growing interest in the use of biopolymers, such as proteins and polysaccharides, over recent years. The objective of this study is to increase the oxidative stability of sunflower oil by microencapsulation in plant protein matrices using the spray-drying technique. Material and methods: Sunflower oil was used as a model substance for oxidizable food oils. Proteins from brown rice, hemp, pea, soy and sunflower seeds were used as emulsifiers and microencapsulation wall materials. First, the proteins were solubilized in distilled water. Then, the emulsions were pre-homogenized using a high-speed homogenizer (Ultra-Turrax) and stabilized by using a high-pressure homogenizer (HPH). Drying of the emulsion was performed in a Mini Spray Dryer. The oxidative stability of the encapsulated oil was determined by performing accelerated oxidation tests with a Rancimat. The size of the microparticles was measured using a laser diffraction analyzer. The morphology of the spray-dried microparticles was acquired using environmental scanning electron microscopy (ESEM). Results: Pure sunflower oil was used as a reference material. Its induction time was 9.5 ± 0.1 h. The microencapsulation of sunflower oil in pea and soy protein matrices significantly improved its oxidative stability, with induction times of 21.3 ± 0.4 h and 12.5 ± 0.4 h respectively. The encapsulation with hemp proteins did not significantly change the oxidative stability of the encapsulated oil. Sunflower and brown rice proteins were ineffective materials for this application, with induction times of 7.2 ± 0.2 h and 7.0 ± 0.1 h respectively. The volume mean diameters of the microparticles formulated with soy and pea proteins were 8.9 ± 0.1 µm and 16.3 ± 1.2 µm respectively. The values for hemp, sunflower and brown rice proteins could not be obtained due to the agglomeration of the microparticles. ESEM images showed smooth and round microparticles with soy and pea proteins. The surfaces of the microparticles obtained with sunflower and hemp proteins were porous. The surface was rough when brown rice proteins were used as the encapsulating agent. Conclusion: Soy and pea proteins appeared to be efficient wall materials for the microencapsulation of sunflower oil by spray drying. These results were partly explained by the higher solubility of soy and pea proteins in water compared to hemp, sunflower, and brown rice proteins. Acknowledgment: This work has been performed, in partnership with the SAS PIVERT, within the frame of the French Institute for the Energy Transition (Institut pour la Transition Energétique (ITE)) P.I.V.E.R.T. (www.institut-pivert.com) selected as an Investment for the Future (Investissements d’Avenir). This work was supported, as part of the Investments for the Future, by the French Government under the reference ANR-001-01. Keywords: biopolymer, edible oil, microencapsulation, oxidative stability, release, spray-drying
373 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts
Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig
Abstract:
This study focuses on the evaluation of snow avalanche simulations, based on a survey that has been carried out among avalanche experts. In the last decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities (e.g. pressure, velocities, flow heights, runout lengths etc.) of the avalanche flow. Because of the highly variable regimes of the flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies to the fluid dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. In addition to these limitations, there exist high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations into an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility. Hence, model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise on avalanche simulations: their assets and drawbacks, their potential for improvement, as well as their application in practice. To address these questions, a survey among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries has been conducted. In the questionnaire, special attention is drawn to the experts’ opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to what degree a simulation result influences decision making for a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations. Keywords: expert interview, hazard management, modeling, simulation, snow avalanche
372 Thermodynamics of Aqueous Solutions of Organic Molecule and Electrolyte: Use Cloud Point to Obtain Better Estimates of Thermodynamic Parameters
Authors: Jyoti Sahu, Vinay A. Juvekar
Abstract:
Electrolytes are often used to bring about salting-in and salting-out of organic molecules and polymers (e.g. polyethylene glycols/proteins) from aqueous solutions. For quantification of these phenomena, a thermodynamic model which can accurately predict the activity coefficient of the electrolyte as a function of temperature is needed. The thermodynamic models available in the literature contain a large number of empirical parameters. These parameters are estimated using the lower/upper critical solution temperature of the electrolyte/organic molecule solution at different temperatures. Since the number of parameters is large, inaccuracy can creep in during their estimation, which can affect the reliability of prediction beyond the range in which these parameters are estimated. The cloud point of a solution is related to its free energy through its temperature and composition derivatives. Hence, cloud point measurements can be used for accurate estimation of the temperature and composition dependence of the parameters in the model for free energy. Hence, if we use a two-pronged procedure in which we first use the cloud point of the solution to estimate some of the parameters of the thermodynamic model and determine the rest using osmotic coefficient data, we gain on two counts. First, since the parameters estimated in each of the two steps are fewer, we achieve higher accuracy of estimation. The second and more important gain is that the resulting model parameters are more sensitive to temperature. This is crucial when we wish to use the model outside the temperature window within which the parameter estimation is sought. The focus of the present work is to prove this proposition. We have used electrolyte (NaCl/Na2CO3)-water-organic molecule (isopropanol/ethanol) as the model system. The model of Robinson-Stokes-Glueckauf is modified by incorporating the temperature-dependent Flory-Huggins interaction parameters. The Helmholtz free energy expression contains, in addition to electrostatic and translational entropic contributions, three Flory-Huggins pairwise interaction contributions, viz. χwp, χws, and χps (w-water, p-polymer, s-salt). These parameters depend both on temperature and concentrations. The concentration dependence is expressed in the form of a quadratic expression involving the volume fractions of the interacting species. The temperature dependence is expressed through an empirical function of temperature. To obtain the temperature-dependent interaction parameters for organic molecule-water and electrolyte-water systems, the critical solution temperature of electrolyte-water-organic molecule systems is measured using a cloud point measuring apparatus. The temperature- and composition-dependent interaction parameters for electrolyte-water-organic molecule are estimated through measurement of the cloud point of the solution. The model is used to estimate the critical solution temperature (CST) of electrolyte-water-organic molecule solutions. We have experimentally determined the critical solution temperature of different compositions of electrolyte-water-organic molecule solutions and compared the results with the estimates based on our model. The two sets of values show good agreement. On the other hand, when only osmotic coefficients are used for estimation of the free energy model, the CST predicted using the resulting model shows poor agreement with the experiments.
Thus, the importance of the CST data in the estimation of parameters of the thermodynamic model is confirmed through this work. Keywords: concentrated electrolytes, Debye-Hückel theory, interaction parameters, Robinson-Stokes-Glueckauf model, Flory-Huggins model, critical solution temperature
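As an illustrative sketch (not taken from the authors' paper, whose exact functional forms were lost in the abstract text), the parameterisation described above can be written as a quadratic composition dependence multiplied by an empirical temperature function; the coefficients a, b and c below are assumptions, not the authors' fitted values.

```latex
% Sketch of the temperature- and composition-dependent Flory-Huggins
% interaction parameters described in the abstract (assumed form).
\begin{align}
  \chi_{ij}(T,\phi_j) &= \chi_{ij}^{0}(T)\,
      \left(c_{0} + c_{1}\,\phi_{j} + c_{2}\,\phi_{j}^{2}\right),
      \qquad ij \in \{wp,\; ws,\; ps\},\\
  \chi_{ij}^{0}(T) &= a_{ij} + \frac{b_{ij}}{T}.
\end{align}
```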
371 Medical Dressing Induced Digital Ischemia in Patient with Congenital Insensitivity to Pain and Anhidrosis
Authors: Abdulwhab Alotaibi, Abdullah Alzahrani, Ziyad Bokhari, Abdulelah Alghamdi
Abstract:
First described in 1975 by Dr. Miller, medical dressings are an uncommon but possible cause of hand digital ischemia due to their tourniquet-like effect. The incidence of this complication has been reported across a wide range of age groups, yet the pediatric population appears to be particularly vulnerable. Multiple dressing types were reported to have caused ischemic injury, such as elastic wrap, tubular gauze, and self-adherent dressings. We present a case of medical dressing induced digital ischemia in a patient with Congenital insensitivity to pain and anhidrosis (CIPA), which further challenged the discovery of the condition. An 8-year-old girl, a known case of CIPA, was brought by her mother to the ER after a nail bed injury, which the mother had managed by applying an elastic wrap that was left on for 24 hours. When the mother found out, she immediately removed the elastic band and noticed the fingertip was black and cold with tense bullae. By the time she arrived at the ER, the color had changed to dark purple with bluish discoloration on the tip. On examination, there were well-demarcated tense bullae on the distal right fifth finger. Neurovascular examination was intact, pulse oximetry on the distal digit was 100%, and capillary refill time was delayed. She was seen by Plastic Surgery, conservative management was recommended, and the patient was discharged with safety netting. Two days later, the patient came for a follow-up visit at which her condition demonstrated significant improvement: the bullae had since ruptured, leaving behind sloughed skin; capillary refill and pulse oximetry were both within normal limits; sensory function could not be assessed, but her motor function and range of motion were normal; topical bacitracin and bandage dressings were applied to the eroded skin. The patient was scheduled for a follow-up in 2 weeks. Preventatively, it is advisable to avoid the commonly implicated dressings such as elastic, tubular gauze, or self-adherent wraps in hand or digital injuries when possible, but in cases where the use of these dressings is necessary, the appropriate precautions must be taken. Dr. Makarewich proposed the following 5 measures to help minimize the incidence of the injury: 1- unwrapping 12 inches of the dressing before rolling the injured finger; 2- wrapping from distal to proximal with minimal tension to avoid vascular embarrassment; 3- the use of 5-25 inch to overlap the entire wrap; 4- maintaining light pressure over the wrap to allow adherence of the dressing; 5- minimization of the number of layers used to wrap the affected digit. Assessing the capillary refill after application can also help in determining the patency of the supplying blood vessels. It is also important to selectively determine whether the patient is a candidate for conservative management, as a tailored approach can help in maximizing positive outcomes for patients. Keywords: congenital insensitivity to pain, digital ischemia, medical dressing, conservative management
370 Analysis of the Evolution of the Behavior of Land Users Linked to the Surge in the Prices of Cash Crops: Case of the Northeast Region of Madagascar
Authors: Zo Hasina Rabemananjara
Abstract:
The North-East of Madagascar is the pillar of Madagascar's foreign trade, providing 41% and 80% of world exports of cloves and vanilla, respectively, in 2016. The north-eastern escarpment is home to the island's last large-scale humid forest massifs, surrounded by a small-scale agricultural mosaic. In the sites where this study is taking place, located in the peripheral zones of protected areas, the production of cash crops aims to supply international markets. In fact, importers of the cash crops produced in these areas are located mainly in India, Singapore, France, Germany and the United States. Recently, the price of these products has increased significantly, especially from the year 2015. For vanilla, the price has skyrocketed, from an approximate price of 73 USD per kilo in 2015 to more than 250 USD per kilo in 2016. The value of clove exports increased sharply by 49.4% in 2017, largely to Singapore and India, due to the sharp increase in exported volume (+47.6%) in 2017. While the relationship between the rise in the prices of cash crops and the change in physical environments is known, the evolution of the behavior of land users linked to this aspect had not yet been addressed by research. In fact, the consequence of this price increase on the organization of the use of space at the local level still raises questions. Hence, the research question is: to what extent does this improvement in the price of export products affect user behavior linked to the local organization of access to land as a factor of production? To fully appreciate this change in behavior, surveys of 144 land user households were carried out, and group interviews were also conducted. The results of this research showed that the rise in the prices of cash crops from the year 2015 caused significant changes in the behavior of land users in the study sites. Young people, who had long been unattracted to farming, have started to show interest in it since the period of rising vanilla and clove prices. They have set up their own fields of vanilla and clove cultivation. This revival of interest conferred an important value on the land and caused conflicts, especially between family members, because the acquisition of the cultivated land was done by inheritance or donation. This change in user behavior has also affected the farmers' life strategy, since the latter have decided to abandon rain-fed rice farming, long considered a guaranteed subsistence activity, in favor of cash crops. This research will contribute to nourishing scientific reflection on the management of land use and also to supporting political decision-makers in decision-making on spatial planning. Keywords: behavior of land users, North-eastern Madagascar, price of export products, spatial planning
369 A 4-Month Low-carb Nutrition Intervention Study Aimed to Demonstrate the Significance of Addressing Insulin Resistance in 2 Subjects with Type-2 Diabetes for Better Management
Authors: Shashikant Iyengar, Jasmeet Kaur, Anup Singh, Arun Kumar, Ira Sahay
Abstract:
Insulin resistance (IR) is a condition that occurs when cells in the body become less responsive to insulin, leading to higher levels of both insulin and glucose in the blood. This condition is linked to metabolic syndromes, including diabetes. It is crucial to address IR promptly after diagnosis to prevent long-term complications associated with high insulin and high blood glucose. This four-month case study highlights the importance of treating the underlying condition to manage diabetes effectively. Insulin is essential for regulating blood sugar levels by facilitating the uptake of glucose into cells for energy or storage. In IR individuals, cells are less efficient at taking up glucose from the blood, resulting in elevated blood glucose levels. As a result of IR, beta cells produce more insulin to make up for the body's inability to use insulin effectively. This leads to high insulin levels, a condition known as hyperinsulinemia, which further impairs glucose metabolism and can contribute to various chronic diseases. In addition to regulating blood glucose, insulin has anti-catabolic effects, preventing the breakdown of molecules in the body, such as inhibiting glycogen breakdown in the liver, inhibiting gluconeogenesis, and inhibiting lipolysis. If a person is insulin-sensitive or metabolically healthy, an optimal level of insulin prevents fat cells from releasing fat and promotes the storage of glucose and fat in the body. Thus, optimal insulin levels are crucial for maintaining energy balance and play a key role in metabolic processes. During the four-month study, researchers looked at the impact of a low-carb dietary (LCD) intervention on two male individuals (A & B) who had Type-2 diabetes. Although neither of these individuals was obese, they were both slightly overweight and had abdominal fat deposits. Before the trial began, important markers such as fasting blood glucose (FBG), triglycerides (TG), high-density lipoprotein (HDL) cholesterol, and HbA1c were measured. These markers are essential in defining metabolic health; their individual values and variability are integral in deciphering it. The ratio of TG to HDL is used as a surrogate marker for IR. This ratio has a high correlation with the prevalence of metabolic syndrome and with IR itself. It is a convenient measure because it can be calculated from a standard lipid profile and does not require more complex tests. In this four-month trial, an improvement in insulin sensitivity was observed through the ratio of TG/HDL, which, in turn, improves fasting blood glucose levels and HbA1c. For subject A, HbA1c dropped from 13 to 6.28, and for subject B, it dropped from 9.4 to 5.7. During the trial, neither of the subjects was taking any diabetic medications. The significant improvements in their health markers, such as better glucose control, along with an increase in energy levels, demonstrate that incorporating LCD interventions can effectively manage diabetes. Keywords: metabolic disorder, insulin resistance, type-2 diabetes, low-carb nutrition
368 Using Group Concept Mapping to Identify a Pharmacy-Based Trigger Tool to Detect Adverse Drug Events
Authors: Rodchares Hanrinth, Theerapong Srisil, Peeraya Sriphong, Pawich Paktipat
Abstract:
The trigger tool is a low-cost, low-tech method to detect adverse events through clues called triggers. The Institute for Healthcare Improvement (IHI) has developed the Global Trigger Tool for measuring and preventing adverse events. However, this tool is not specific for detecting adverse drug events. A pharmacy-based trigger tool is needed to detect adverse drug events (ADEs). Group concept mapping is an effective method for conceptualizing various ideas from diverse stakeholders. This technique was used to identify a pharmacy-based trigger tool to detect ADEs. The aim of this study was to involve pharmacists in conceptualizing, developing, and prioritizing a feasible trigger tool to detect adverse drug events in a provincial hospital in the northeastern part of Thailand. The study was conducted during the 6-month period between April 1 and September 30, 2017. Study participants involved 20 pharmacists (17 hospital pharmacists and 3 pharmacy lecturers) engaging in three concept mapping workshops. In these workshops, the concept mapping technique created by Trochim, a highly structured qualitative group technique for generating and sharing ideas, was used to produce and structure participants' views on which triggers had the potential to detect ADEs. During the workshops, participants (n = 20) were asked to individually rate the feasibility and potentiality of each trigger and to group them into relevant categories to enable multidimensional scaling and hierarchical cluster analysis. The outputs of the analysis included the trigger list, cluster list, point map, point rating map, cluster map, and cluster rating map. The three workshops together resulted in 21 different triggers that were structured in a framework forming 5 clusters: drug allergy, drug-induced diseases, dosage adjustment in renal diseases, potassium concerns, and drug overdose. The first cluster is drug allergy, with triggers such as the doctor’s orders for dexamethasone injection combined with chlorpheniramine injection. The diagnosis of drug-induced hepatitis in a patient taking anti-tuberculosis drugs is one trigger in the ‘drug-induced diseases’ cluster. For the third cluster, the doctor’s orders for enalapril combined with ibuprofen in a patient with chronic kidney disease is an example of a trigger. The doctor’s orders for digoxin in a patient with hypokalemia is a trigger in the potassium concerns cluster. Finally, the doctor’s orders for naloxone with narcotic overdose were classified as a trigger in the drug overdose cluster. This study generated triggers that are similar to some of those in the IHI Global Trigger Tool, especially in the medication module, such as drug allergy and drug overdose. However, there are some specific aspects of this tool, including drug-induced diseases, dosage adjustment in renal diseases, and potassium concerns, which are not contained in any existing trigger tools. The pharmacy-based trigger tool is suitable for pharmacists in hospitals to detect potential adverse drug events using trigger clues. Keywords: adverse drug events, concept mapping, hospital, pharmacy-based trigger tool
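As an illustration of the Trochim-style analysis described above (not the authors' code), the sketch below assumes 20 participants sorting 21 triggers into piles; the pile co-occurrence matrix is scaled with multidimensional scaling and the resulting point map is grouped by hierarchical (Ward) clustering into 5 clusters. The sort data here is a random placeholder.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

n_participants, n_triggers = 20, 21
rng = np.random.default_rng(0)
# placeholder sort data: piles[p, t] = pile id participant p assigned to trigger t
piles = rng.integers(0, 5, size=(n_participants, n_triggers))

# similarity = how often two triggers were sorted into the same pile
similarity = sum((piles[p][:, None] == piles[p][None, :]).astype(float)
                 for p in range(n_participants))
distance = n_participants - similarity            # dissimilarity matrix

points = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)      # point map
clusters = fcluster(linkage(points, method="ward"), t=5,
                    criterion="maxclust")                  # cluster map
print(clusters)
```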
367 Landslide Hazard Assessment Using Physically Based Mathematical Models in Agricultural Terraces at Douro Valley in North of Portugal
Authors: C. Bateira, J. Fernandes, A. Costa
Abstract:
The Douro Demarcated Region (DDR) is a Porto wine production region. In the NE of Portugal, the strong incision of the Douro valley developed very steep slopes, organized with agricultural terraces, which have experienced an intense and deep transformation in order to implement the mechanization of the work. The old terrace system, based on vertical stone wall support structures, was replaced by terraces with earth embankments, which have experienced huge instability. This terrace instability has important economic and financial consequences for the agricultural enterprises. This paper presents and develops cartographic tools to assess the embankment instability and identify the areas prone to instability. The priority in this evaluation is the use of physically based mathematical models and the development of a validation process based on an inventory of past embankment instability. We used the shallow landslide stability model (SHALSTAB) based on physical parameters such as cohesion (c’), friction angle (φ), hydraulic conductivity, soil depth, soil specific weight (ρ), slope angle (α), and contributing areas computed by the Multiple Flow Direction (MFD) method. A terraced area can be analysed by these models only if we have very detailed information representative of the terrain morphology. The slope angle and the contributing areas depend on that. We can achieve that purpose using digital elevation models (DEM) with high resolution (pixels with a 40 cm side), resulting from a set of photographs taken by a flight at 100 m altitude with a pixel resolution of 12 cm. The slope angle results from this DEM. On the other hand, the MFD contributing area models the internal flow and is an important element to define the spatial variation of the soil saturation. That internal flow is based on the DEM. That is supported by the statement that the interflow, although not coincident with the superficial flow, has an important similarity to it. Electrical resistivity monitoring values were related to the MFD contributing areas built from a DEM of 1 m resolution and revealed a consistent correlation. That analysis, performed on the area, showed a good correlation with R² of 0.72 and 0.76 at 1.5 m and 2 m depth, respectively. Considering that, a DEM with 1 m resolution was the base to model the real internal flow. Thus, we assumed that the contributing area of 1 m resolution modelled by MFD is representative of the internal flow of the area. In order to solve this problem, we used a set of generalized DEMs to build the contributing areas used in SHALSTAB. Those DEMs, with several resolutions (1 m and 5 m), were built from a set of photographs with 50 cm resolution taken by a flight at 5 km altitude. Using this combination of maps, we modelled several final maps of terrace instability and performed a validation process with the contingency matrix. The best final instability map combines the slope map from a DEM of 40 cm resolution and an MFD map from a DEM of 1 m resolution, with a True Positive Rate (TPR) of 0.97, a False Positive Rate (FPR) of 0.47, Accuracy (ACC) of 0.53, Precision (PVC) of 0.0004, and a TPR/FPR ratio of 2.06. Keywords: agricultural terraces, cartography, landslides, SHALSTAB, vineyards
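As a small sketch of the contingency-matrix validation used above to score each instability map (not the study's code; the counts passed in the example call are placeholders, not the study's data):

```python
# Contingency-matrix metrics for a binary instability map vs. the landslide inventory.
def contingency_metrics(tp, fp, fn, tn):
    tpr = tp / (tp + fn)                     # True Positive Rate (sensitivity)
    fpr = fp / (fp + tn)                     # False Positive Rate
    acc = (tp + tn) / (tp + fp + fn + tn)    # Accuracy
    ppv = tp / (tp + fp)                     # Precision (PVC in the abstract)
    return tpr, fpr, acc, ppv, tpr / fpr     # last value: TPR/FPR ratio

# placeholder cell counts, only to show how the reported metrics are derived
print(contingency_metrics(tp=970, fp=470_000, fn=30, tn=530_000))
```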
366 Optimizing Stormwater Sampling Design for Estimation of Pollutant Loads
Authors: Raja Umer Sajjad, Chang Hee Lee
Abstract:
Stormwater runoff is the leading contributor to pollution of receiving waters. In response, an efficient stormwater monitoring program is required to quantify and eventually reduce stormwater pollution. The overall goals of stormwater monitoring programs primarily include the identification of high-risk dischargers and the development of total maximum daily loads (TMDLs). The challenge in developing a better monitoring program is to reduce the variability in flux estimates due to sampling errors; the success of a monitoring program mainly depends on the accuracy of the estimates. Apart from sampling errors, manpower and budgetary constraints also influence the quality of the estimates. This study attempted to develop an optimum stormwater monitoring design considering both the cost and the quality of the estimated pollutant flux. Three years of stormwater monitoring data (2012–2014) from a mixed land-use site located within the Geumhak watershed, South Korea, were evaluated. The regional climate is humid, and precipitation is usually well distributed throughout the year. The investigation of a large number of water quality parameters is time-consuming and resource-intensive. In order to identify a suite of easy-to-measure parameters to act as surrogates, Principal Component Analysis (PCA) was applied. Means, standard deviations, coefficients of variation (CV), and other simple statistics were computed using the multivariate statistical analysis software SPSS 22.0. The implications of sampling time for monitoring results, the number of samples required during a storm event, and the impact of the seasonal first flush were also identified. Based on the observations derived from the PCA biplot and the correlation matrix, total suspended solids (TSS) was identified as a potential surrogate for turbidity, total phosphorus, and heavy metals like lead, chromium, and copper, whereas Chemical Oxygen Demand (COD) was identified as a surrogate for organic matter. The CV among the different monitored water quality parameters was found to be high (ranging from 3.8 to 15.5). This suggests that the use of a grab sampling design to estimate mass emission rates in the study area can lead to errors due to the large variability. The TSS discharge load calculation error was found to be only 2% between two different sample size approaches, i.e., 17 samples per storm event and 6 equally distributed samples per storm event. Both seasonal first flush and event first flush phenomena for most water quality parameters were observed in the study area. Samples taken at the initial stage of a storm event generally overestimate the mass emissions; however, it was found that collecting a grab sample after the initial hour of a storm event more closely approximates the mean concentration of the event. It was concluded that site- and regional-climate-specific interventions can be made to optimize the stormwater monitoring program in order to make it more effective and economical. Keywords: first flush, pollutant load, stormwater monitoring, surrogate parameters
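As an illustrative sketch of the surrogate-parameter screening described above (not the authors' code; the file name and column names are assumptions, not the monitored parameter list):

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

df = pd.read_csv("storm_event_emcs.csv")     # hypothetical table of event mean concentrations
params = ["TSS", "Turbidity", "TP", "Pb", "Cr", "Cu", "COD"]
X = StandardScaler().fit_transform(df[params])

# PCA loadings show which parameters group together on the biplot
pca = PCA(n_components=2).fit(X)
loadings = pd.DataFrame(pca.components_.T, index=params, columns=["PC1", "PC2"])
print(loadings)

print(df[params].corr().round(2))                 # correlation matrix backing the surrogate choice
print((df[params].std() / df[params].mean()))     # coefficient of variation per parameter
```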
365 High Speed Motion Tracking with Magnetometer in Nonuniform Magnetic Field
Authors: Jeronimo Cox, Tomonari Furukawa
Abstract:
Magnetometers have become more popular in inertial measurement units (IMUs) for their ability to correct estimations using the earth's magnetic field. Accelerometer- and gyroscope-based packages fail as dead-reckoning errors accumulate over time. Localization in robotic applications with magnetometer-inclusive IMUs has become popular as a way to track the odometry of slower-speed robots. With high-speed motions, the accumulated error increases over smaller periods of time, making them difficult to track with an IMU. Tracking a high-speed motion is especially difficult with limited observability. Visual obstruction of motion leaves motion-tracking cameras unusable. When motions are too dynamic for estimation techniques reliant on the observability of the gravity vector, the use of magnetometers is further justified. As available magnetometer calibration methods are limited by the assumption that background magnetic fields are uniform, estimation in nonuniform magnetic fields is problematic. Hard iron distortion is a distortion of the magnetic field by other objects that produce magnetic fields. This kind of distortion is often observed as an offset of the center of the data points from the origin when a magnetometer is rotated. The magnitude of hard iron distortion is dependent on proximity to distortion sources. Soft iron distortion is more related to the scaling of the axes of magnetometer sensors. Hard iron distortion is more of a contributor to the error of attitude estimation with magnetometers. Indoor environments or spaces inside ferrite-based structures, such as building reinforcements or a vehicle, often cause distortions with proximity. As positions correlate to areas of distortion, methods of magnetometer localization include the production of spatial maps of the magnetic field and the collection of distortion signatures to better aid location tracking. The goal of this paper is to compare magnetometer methods that do not need pre-produced magnetic field maps. Mapping the magnetic field in some spaces can be costly and inefficient. Dynamic measurement fusion is used to track the motion of a multi-link system in this work. Conventional calibration by data collection of rotation at a static point, real-time estimation of calibration parameters at each time step, and the use of two magnetometers for determining local hard iron distortion are compared to confirm the robustness and accuracy of each technique. With opposite-facing magnetometers, hard iron distortion can be accounted for regardless of position, rather than assuming that it is constant regardless of positional change. The motion measured is a repeatable planar motion of a two-link system connected by revolute joints. The links are translated on a moving base to induce rotation of the links. The joints are equipped with absolute encoders, and the motion is recorded with cameras to enable ground truth comparison with each of the magnetometer methods. While the two-magnetometer method accounts for local hard iron distortion, the method fails where the magnetic field direction in space is inconsistent. Keywords: motion tracking, sensor fusion, magnetometer, state estimation
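As a minimal sketch of the conventional static-rotation calibration mentioned above (not the authors' implementation; the input file is a placeholder, and the min/max centring is one simple way to estimate the offsets):

```python
import numpy as np

# hypothetical N x 3 magnetometer readings captured while rotating at a static point
mag = np.loadtxt("rotation_capture.csv", delimiter=",")

# hard-iron offset ~ centre of the measurement cloud for a full rotation
offset = (mag.max(axis=0) + mag.min(axis=0)) / 2.0
calibrated = mag - offset

# crude per-axis (soft-iron) scale correction from the same capture
radii = (mag.max(axis=0) - mag.min(axis=0)) / 2.0
calibrated /= radii / radii.mean()
print("hard-iron offset:", offset, "axis radii:", radii)
```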
364 Impact of Traffic Restrictions due to Covid19, on Emissions from Freight Transport in Mexico City
Authors: Oscar Nieto-Garzón, Angélica Lozano
Abstract:
In urban areas, on-road freight transportation creates several social and environmental externalities. Hence, it is crucial that freight transport considers not only economic aspects, like retailer distribution cost reduction and service improvement, but also environmental effects such as global CO2 and local emissions (e.g. Particulate Matter, NOX, CO) and noise. Inadequate infrastructure development, a high rate of urbanization, increasing motorization, and the lack of transportation planning are characteristics that urban areas in developing countries share. The Metropolitan Area of Mexico City (MAMC), the Metropolitan Area of São Paulo (MASP), and Bogota are three of the largest urban areas in Latin America where air pollution is often a problem associated with emissions from mobile sources. The effect of the lockdown due to COVID-19 was analyzed for these urban areas, comparing the same period (January to August) of the years 2016–2019 with 2020. A strong reduction in the concentration of primary criteria pollutants emitted by road traffic was observed at the beginning of 2020 and after the lockdown measures. The daily mean concentration of NOX decreased by 40% in the MAMC, 34% in the MASP, and 62% in Bogota. Daily mean ozone levels increased after the lockdown measures in the three urban areas, by 25% in the MAMC, 30% in the MASP, and 60% in Bogota. These changes in emission patterns from mobile sources drastically changed the ambient atmospheric concentrations of CO and NOX. The CO/NOX ratio at the morning hours is often used as an indicator of mobile source emissions. In 2020, traffic from cars and light vehicles was significantly reduced due to the first lockdown, but buses and trucks had no restrictions. In theory, it implies a decrease in CO and NOX from cars and light vehicles, while maintaining the levels of NOX from trucks (or lower levels due to the congestion reduction). At rush hours, traffic was reduced between 50% and 75%, so trucks could reach higher speeds, which would reduce their emissions. By means of an emission model, it was found that an increase in the average speed (75%) would reduce the emissions (CO, NOX, and PM) from diesel trucks by up to 30%. It was expected that the value of the CO/NOX ratio could change due to the lockdown restrictions. However, although there was a significant reduction of traffic, CO/NOX kept its trend, decreasing to 8-9 in 2020. Hence, traffic restrictions had no impact on the CO/NOX ratio, although they did reduce vehicle emissions of CO and NOX. Therefore, these emissions may not adequately represent the change in the vehicle emission patterns, or this ratio may not be a good indicator of emissions generated by vehicles. From the comparison of the theoretical data and those observed during the lockdown, it follows that the real NOX reduction was lower than the theoretical reduction. The reasons could be that there are other sources of NOX emissions, so there would be an over-representation of NOX emissions generated by diesel vehicles, or there is an underestimation of CO emissions. Further analysis needs to consider this ratio to evaluate the emission inventories and then to extend these results for the determination of emission control policies for non-mobile sources. Keywords: COVID-19, emissions, freight transport, Latin American metropolis
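As a small sketch of the two indicators discussed above, the lockdown change in daily mean NOX and the morning-hour CO/NOX ratio (values below are placeholders, not the monitoring records of the three cities):

```python
import numpy as np

nox_2016_2019 = np.array([48.0, 52.0, 50.0, 46.0])   # hypothetical daily means, ppb
nox_2020      = np.array([30.0, 28.0, 31.0, 29.0])
change = 100.0 * (nox_2020.mean() - nox_2016_2019.mean()) / nox_2016_2019.mean()
print(f"NOx change after lockdown: {change:.1f}%")    # about -40% was reported for the MAMC

co_morning  = np.array([1.2, 1.4, 1.1])               # hypothetical CO at rush hours, ppm
nox_morning = np.array([0.14, 0.16, 0.13])            # hypothetical NOx at rush hours, ppm
print("CO/NOx:", (co_morning / nox_morning).mean())   # trend compared year over year
```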
363 Prosodic Transfer in Foreign Language Learning: A Phonetic Crosscheck of Intonation and F₀ Range between Italian and German Native and Non-Native Speakers
Authors: Violetta Cataldo, Renata Savy, Simona Sbranna
Abstract:
Background: Foreign Language Learning (FLL) is characterised by prosodic transfer phenomena regarding pitch accents placement, intonation patterns, and pitch range excursion from the learners’ mother tongue to their Foreign Language (FL) which suggests that the gradual development of general linguistic competence in FL does not imply an equally correspondent improvement of the prosodic competence. Topic: The present study aims to monitor the development of prosodic competence of learners of Italian and German throughout the FLL process. The primary object of this study is to investigate the intonational features and the f₀ range excursion of Italian and German from a cross-linguistic perspective; analyses of native speakers’ productions point out the differences between this pair of languages and provide models for the Target Language (TL). A following crosscheck compares the L2 productions in Italian and German by non-native speakers to the Target Language models, in order to verify the occurrence of prosodic interference phenomena, i.e., type, degree, and modalities. Methodology: The subjects of the research are university students belonging to two groups: Italian native speakers learning German as FL and German native speakers learning Italian as FL. Both of them have been divided into three subgroups according to the FL proficiency level (beginners, intermediate, advanced). The dataset consists of wh-questions placed in situational contexts uttered in both speakers’ L1 and FL. Using a phonetic approach, analyses have considered three domains of intonational contours (Initial Profile, Nuclear Accent, and Terminal Contour) and two dimensions of the f₀ range parameter (span and level), which provide a basis for comparison between L1 and L2 productions. Findings: Results highlight a strong presence of prosodic transfer phenomena affecting L2 productions in the majority of both Italian and German learners, irrespective of their FL proficiency level; the transfer concerns all the three domains of the contour taken into account, although with different modalities and characteristics. Currently, L2 productions of German learners show a pitch span compression on the domain of the Terminal Contour compared to their L1 towards the TL; furthermore, German learners tend to use lower pitch range values in deviation from their L1 when improving their general linguistic competence in Italian FL proficiency level. Results regarding pitch range span and level in L2 productions by Italian learners are still in progress. At present, they show a similar tendency to expand the pitch span and to raise the pitch level, which also reveals a deviation from the L1 possibly in the direction of German TL. Conclusion: Intonational features seem to be 'resistant' parameters to which learners appear not to be particularly sensitive. By contrast, they show a certain sensitiveness to FL pitch range dimensions. Making clear which the most resistant and the most sensitive parameters are when learning FL prosody could lay groundwork for the development of prosodic trainings thanks to which learners could finally acquire a clear and natural pronunciation and intonation.Keywords: foreign language learning, German, Italian, L2 prosody, pitch range, transfer
362 Impact Analysis of a School-Based Oral Health Program in Brazil
Authors: Fabio L. Vieira, Micaelle F. C. Lemos, Luciano C. Lemos, Rafaela S. Oliveira, Ian A. Cunha
Abstract:
Brazil has some challenges ahead related to population oral health, most of them associated with the need to expand promotion and prevention activities to the local level, offer equal access to services, and promote changes in the lifestyle of the population. The program implemented an oral health initiative in public schools in the city of Salvador, Bahia. The mission was to improve oral health among students in primary and secondary education, from 2 to 15 years old, using the school as a pathway to increase access to healthcare. The main actions consisted of team visits to the schools with educational sessions for dental cavity prevention and individual assessments. The program incorporated a clinical surveillance component through a dental evaluation of every student searching for dental disease and caries, standardization of the dentists’ team to reach uniform classification in the assessments, and the use of an online platform to register data directly from the schools. Subsequently, the students with caries were referred for free clinical treatment at the program’s Health Centre. The primary purpose of this study was to analyze the effects and outcomes of this school-based oral health program. The study sample was composed of data from a period of 3 years - 2015 to 2017 - from 13 public schools in the suburbs of the city of Salvador, with a total of 9,278 assessments in this period. From the data collected, the prevalence of children with decay in permanent teeth was chosen as the most reliable indicator. The prevalence was calculated for each one of the 13 schools using the number of children with 1 or more dental caries in permanent teeth divided by the total number of students assessed per school each year. Then the percentage change per year was calculated for each school. Some schools presented a higher variation in the total number of assessments in one of the three years, so for these, the percentage change calculation was done using the two years with less variation. The results show that 10 of the 13 schools presented significant improvements in the indicator of caries in permanent teeth. The mean percentage reduction in the number of students with caries in permanent teeth across the 13 schools was 26.8%, and the median was 32.2%. The highest improvement reached a decrease of 65.6% in the indicator. Three schools presented a rise in caries prevalence (8.9, 18.9 and 37.2% increase) that, on an initial analysis, seems to be explained by the students’ cohort rotation among other schools, as well as absenteeism from treatment. In conclusion, the program shows a relevant impact on the reduction of caries in permanent teeth among students and the need for the continuity and expansion of this integrated healthcare approach. The significance of the articulation between the health and educational systems has also been evident, representing a fundamental approach to improving healthcare access for children, especially in scenarios such as that presented in Brazil. Keywords: primary care, public health, oral health, school-based oral health, data management
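As an illustrative sketch of the per-school indicator described above (the counts below are placeholders, not the program's records):

```python
# Prevalence of caries in permanent teeth and its year-over-year percentage change.
def prevalence(children_with_caries, children_assessed):
    return children_with_caries / children_assessed

p_2015 = prevalence(120, 300)   # hypothetical counts for one school in 2015
p_2017 = prevalence(81, 290)    # hypothetical counts for the same school in 2017
pct_change = 100.0 * (p_2017 - p_2015) / p_2015
print(f"{pct_change:.1f}% change in caries prevalence")   # negative = improvement
```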
361 Cockpit Integration and Piloted Assessment of an Upset Detection and Recovery System
Authors: Hafid Smaili, Wilfred Rouwhorst, Paul Frost
Abstract:
The trend of recent accident and incident cases worldwide shows that state-of-the-art automation and operations, for current and future demanding operational environments, do not provide the desired level of operational safety under crew peak workload conditions, specifically in complex situations such as loss-of-control in-flight (LOC-I). Today, the short-term focus is on preparing crews to recognise and handle LOC-I situations through upset recovery training. This paper describes the cockpit integration aspects and piloted assessment of both a manually assisted and an automatic upset detection and recovery system that has been developed and demonstrated within the European Advanced Cockpit for Reduction Of StreSs and workload (ACROSS) programme. The proposed system is a function that continuously monitors the aircraft, intervenes when it enters an upset, and either provides manual pilot-assisted guidance or takes over full control of the aircraft to recover from the upset. In order to mitigate the high physical and psychological impact during aircraft upset events, the system provides new cockpit functionalities to support the pilot in recovering from any upset, both manually assisted and automatically. A piloted simulator assessment was made in Oct-Nov 2015 using ten pilots in a representative large civil fly-by-wire transport aircraft, evaluating the preference for the tested upset detection and recovery system configurations to reduce pilot workload, increase situational awareness, and ensure safe interaction with the manually assisted or automated modes. The piloted simulator evaluation of the upset detection and recovery system showed that the functionalities of the system are able to support pilots during an upset. The experiment showed that pilots are willing to rely on the guidance provided by the system during an upset. Thereby, it is important for pilots to see and understand what the aircraft is doing and trying to do, especially in automatic modes. Comparing the manually assisted and the automatic recovery modes, the pilots’ opinion was that an automatic recovery reduces the workload so that they can perform a proper screening of the primary flight display. The results further show that the manually assisted recoveries, with recovery guidance cues on the cockpit primary flight display, reduced workload for severe upsets compared to today’s situation. The level of situation awareness was improved for automatic upset recoveries where the pilot could monitor what the system was trying to accomplish, compared to automatic recovery modes without any guidance. An improvement in situation awareness was also noticeable with the manually assisted upset recovery functionalities as compared to the current non-assisted recovery procedures. This study shows that automatic upset detection and recovery functionalities are likely to positively impact operational safety by means of reduced workload, improved situation awareness, and crew stress reduction. It is thus believed that future developments for upset recovery guidance and loss-of-control prevention should focus on automatic recovery solutions. Keywords: aircraft accidents, automatic flight control, loss-of-control, upset recovery
360 Cobb Angle Measurement from Coronal X-Rays Using Artificial Neural Networks
Authors: Andrew N. Saylor, James R. Peters
Abstract:
Scoliosis is a complex 3D deformity of the thoracic and lumbar spines, clinically diagnosed by measurement of a Cobb angle of 10 degrees or more on a coronal X-ray. The Cobb angle is the angle made by the lines drawn along the proximal and distal endplates of the respective proximal and distal vertebrae comprising the curve. Traditionally, Cobb angles are measured manually using either a marker, straight edge, and protractor or image measurement software. The task of measuring the Cobb angle can also be represented by a function taking the spine geometry rendered using X-ray imaging as input and returning the approximate angle. Although the form of such a function may be unknown, it can be approximated using artificial neural networks (ANNs). The performance of ANNs is affected by many factors, including the choice of activation function and network architecture; however, the effects of these parameters on the accuracy of scoliotic deformity measurements are poorly understood. Therefore, the objective of this study was to systematically investigate the effect of ANN architecture and activation function on Cobb angle measurement from the coronal X-rays of scoliotic subjects. The data set for this study consisted of 609 coronal chest X-rays of scoliotic subjects divided into 481 training images and 128 test images. These data, which included labeled Cobb angle measurements, were obtained from the SpineWeb online database. In order to normalize the input data, each image was resized using bi-linear interpolation to a size of 500 × 187 pixels, and the pixel intensities were scaled to be between 0 and 1. A fully connected (dense) ANN with a fixed cost function (mean squared error), batch size (10), and learning rate (0.01) was developed using Python Version 3.7.3 and TensorFlow 1.13.1. The activation functions (sigmoid, hyperbolic tangent [tanh], or rectified linear units [ReLU]), number of hidden layers (1, 3, 5, or 10), and number of neurons per layer (10, 100, or 1000) were varied systematically to generate a total of 36 network conditions. Stochastic gradient descent with early stopping was used to train each network. Three trials were run per condition, and the final mean squared errors and mean absolute errors were averaged to quantify the network response for each condition. The network that performed the best used ReLU neurons, three hidden layers, and 100 neurons per layer. The average mean squared error of this network was 222.28 ± 30 degrees², and the average mean absolute error was 11.96 ± 0.64 degrees. It is also notable that while most of the networks performed similarly, the networks using ReLU neurons, 10 hidden layers, and 1000 neurons per layer, and those using tanh neurons, one hidden layer, and 10 neurons per layer performed markedly worse, with average mean squared errors greater than 400 degrees² and average mean absolute errors greater than 16 degrees. From the results of this study, it can be seen that the choice of ANN architecture and activation function has a clear impact on Cobb angle inference from coronal X-rays of scoliotic subjects. Keywords: scoliosis, artificial neural networks, cobb angle, medical imaging
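As a sketch of the best-performing configuration reported above (ReLU, three hidden layers of 100 neurons, MSE loss, SGD with learning rate 0.01, batch size 10, early stopping), written against the current tf.keras API rather than the study's TensorFlow 1.13 code; the data-loading file names are assumptions:

```python
import numpy as np
import tensorflow as tf

x_train = np.load("xrays_train.npy")    # hypothetical (481, 500, 187) images scaled to [0, 1]
y_train = np.load("cobb_train.npy")     # hypothetical Cobb angles in degrees

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(500, 187)),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(1),                      # regression output: Cobb angle
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="mse", metrics=["mae"])
model.fit(x_train, y_train, batch_size=10, epochs=200,
          validation_split=0.1,
          callbacks=[tf.keras.callbacks.EarlyStopping(patience=10)])
```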
359 Vehicle Timing Motion Detection Based on Multi-Dimensional Dynamic Detection Network
Authors: Jia Li, Xing Wei, Yuchen Hong, Yang Lu
Abstract:
Detecting vehicle behavior has always been the focus of intelligent transportation, but with the explosive growth of the number of vehicles and the complexity of the road environment, the vehicle behavior videos captured by traditional surveillance have been unable to satisfy the study of vehicle behavior. The traditional method of manually labeling vehicle behavior is too time-consuming and labor-intensive, and the existing object detection and tracking algorithms have poor practicability and a low behavioral localization detection rate. This paper proposes a vehicle behavior detection algorithm based on the dual-stream convolution network and the multi-dimensional video dynamic detection network. In the videos, the straight-line behavior of the vehicle defaults to the background behavior. Changing lanes, turning, and turning around are set as target behaviors. The purpose of this model is to automatically mark the target behavior of the vehicle from the untrimmed videos. First, the target behavior proposals in the long video are extracted through the dual-stream convolution network. The model uses a dual-stream convolutional network to generate a one-dimensional action score waveform, and then extracts segments with scores above a given threshold M into preliminary vehicle behavior proposals. Second, the preliminary proposals are pruned and identified using the multi-dimensional video dynamic detection network. Referring to hierarchical reinforcement learning, the multi-dimensional network includes a Timer module and a Spacer module, where the Timer module mines time information in the video stream and the Spacer module extracts spatial information in the video frame. The Timer and Spacer modules are implemented with Long Short-Term Memory (LSTM) and start from an all-zero hidden state. The Timer module uses the Transformer mechanism to extract timing information from the video stream and extracts features by linear mapping and other methods. Finally, the model fuses time information and spatial information and obtains the location and category of the behavior through the softmax layer. This paper uses recall and precision to measure the performance of the model. Extensive experiments show that, based on the dataset of this paper, the proposed model has obvious advantages compared with the existing state-of-the-art behavior detection algorithms. When the Time Intersection over Union (TIoU) threshold is 0.5, the mean Average Precision (mAP) reaches 36.3% (the mAP of the baselines is 21.5%). In summary, this paper proposes a vehicle behavior detection model based on a multi-dimensional dynamic detection network. This paper introduces spatial information and temporal information to extract vehicle behaviors in long videos. Experiments show that the proposed algorithm is advanced and accurate in vehicle timing behavior detection. In the future, the focus will be on simultaneously detecting the timing behavior of multiple vehicles in complex traffic scenes (such as a busy street) while ensuring accuracy. Keywords: vehicle behavior detection, convolutional neural network, long short-term memory, deep learning
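As a minimal sketch of the proposal-extraction step described above, keeping the contiguous runs of the one-dimensional action-score waveform that exceed the threshold M (the scores array is a placeholder, not network output):

```python
import numpy as np

def extract_proposals(scores, m):
    """Return (start, end) frame indices of contiguous runs where scores > m."""
    above = scores > m
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts = list(edges[~above[edges]] + 1)   # rising edges
    ends = list(edges[above[edges]] + 1)      # falling edges
    if above[0]:
        starts.insert(0, 0)
    if above[-1]:
        ends.append(len(scores))
    return list(zip(starts, ends))

scores = np.array([0.1, 0.2, 0.7, 0.8, 0.6, 0.2, 0.1, 0.9, 0.9, 0.3])
print(extract_proposals(scores, m=0.5))   # -> [(2, 5), (7, 9)]
```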
358 Environmental Impact of a New-Build Educational Building in England: Life-Cycle Assessment as a Method to Calculate Whole Life Carbon Emissions
Authors: Monkiz Khasreen
Abstract:
In the context of the global trend towards reducing new buildings' carbon footprint, the design team is required to make early decisions that have a major influence on embodied and operational carbon. Sustainability strategies should be clear during the early stages of the building design process, as changes made later can be extremely costly. Life-Cycle Assessment (LCA) could be used as the vehicle to carry other tools and processes towards achieving the requested improvement. Although LCA is the 'gold standard' to evaluate buildings from 'cradle to grave', the lack of detail available at the concept design stage makes LCA very difficult, if not impossible, to use as an estimation tool at early stages. Issues related to transparency and accessibility of information in the building industry are affecting the credibility of LCA studies. A verified database derived from LCA case studies is required to be accessible to researchers, design professionals, and decision makers in order to offer guidance on specific areas of significant impact. This database could be built up from data from multiple sources within a pool of research held in this context. One of the most important factors that affects the reliability of such data is the temporal factor, as building materials, components, and systems are rapidly changing with the advancement of technology, making production more efficient and less environmentally harmful. Recent LCA studies on different building functions, types, and structures are always needed to update databases derived from research and to form case bases for comparison studies. There is also a need to make these studies transparent and accessible to designers. The work in this paper sets out to address this need. This paper also presents a life-cycle case study of a new-build educational building in England. The building utilised very current construction methods and technologies and is rated as BREEAM Excellent. Carbon emissions of different life-cycle stages and different building materials and components were modelled. Scenario and sensitivity analyses were used to estimate the future of new educational buildings in England. The study attempts to form an indicator for use during the early design stages of similar buildings. Carbon dioxide emissions of this case study building, when normalised according to floor area, lie towards the lower end of the range of worldwide data reported in the literature. Sensitivity analysis shows that life cycle assessment results are highly sensitive to future assumptions made at the design stage, such as future changes in electricity generation structure over time, refurbishment processes, and recycling. The analyses also prove that large savings in carbon dioxide emissions can result from very small changes at the design stage. Keywords: architecture, building, carbon dioxide, construction, educational buildings, England, environmental impact, life-cycle assessment
Procedia PDF Downloads 113357 Phonological Processing and Its Role in Pseudo-Word Decoding in Children Learning to Read Kannada Language between 5.6 to 8.6 Years
Authors: Vangmayee. V. Subban, Somashekara H. S, Shwetha Prabhu, Jayashree S. Bhat
Abstract:
Introduction and Need: Phonological processing is critical in learning to read alphabetic and non-alphabetic languages. However, its role in learning to read Kannada, an alphasyllabary, is equivocal. The literature has focused on the developmental role of phonological awareness in reading. To the best of the authors' knowledge, the role of phonological memory and phonological naming has not been addressed in the alphasyllabary Kannada language. Therefore, there is a need to evaluate the comprehensive role of phonological processing skills in Kannada on word decoding skills during the early years of schooling. Aim and Objectives: The present study aimed to explore phonological processing abilities and their role in learning to decode pseudowords in children learning to read the Kannada language during the initial years of formal schooling, between 5.6 and 8.6 years. Method: In this cross-sectional study, 60 typically developing Kannada-speaking children, 20 each from Grade I, Grade II, and Grade III, in the age ranges of 5.6 to 6.6 years, 6.7 to 7.6 years, and 7.7 to 8.6 years respectively, were selected from Kannada-medium schools. Phonological processing abilities were assessed using an assessment tool specifically developed to address the objectives of the present research. The assessment tool was content validated by subject experts and had good inter- and intra-subject reliability. Phonological awareness was assessed at the syllable level using syllable segmentation, blending, and syllable stripping at initial, medial, and final positions. Phonological memory was assessed using a pseudoword repetition task, and phonological naming was assessed using rapid automatized naming of objects. Both the phonological awareness and phonological memory measures were scored for accuracy of response, whereas Rapid Automatized Naming (RAN) was scored for total naming speed. Results: The mean score comparison using one-way ANOVA revealed a significant difference (p ≤ 0.05) between the groups on all measures of phonological awareness, pseudoword repetition, rapid automatized naming, and pseudoword reading. Subsequent post-hoc grade-wise comparison using the Bonferroni test revealed significant differences (p ≤ 0.05) between each of the grades for all tasks, except (p ≥ 0.05) for syllable blending, syllable stripping, and pseudoword repetition between Grade II and Grade III. Pearson correlations revealed highly significant positive correlations (p < 0.001) between all the variables except phonological naming, which had significant negative correlations. However, the correlation coefficients were higher for the phonological awareness measures than for the others. Hence, phonological awareness was chosen as the first independent variable to enter the hierarchical regression equation, followed by rapid automatized naming and, finally, pseudoword repetition. The regression analysis revealed syllable awareness as the single most significant predictor of pseudoword reading, explaining a unique variance of 74%, and there was no significant change in R² when RAN and pseudoword repetition were added subsequently to the regression equation. Conclusion: The present study concluded that syllable awareness matures completely by Grade II, whereas phonological memory and phonological naming continue to develop beyond Grade III. Amongst the phonological processing skills, phonological awareness, especially syllable awareness, is more crucial for word decoding than phonological memory and naming during the initial years of schooling.Keywords: phonological awareness, phonological memory, phonological naming, phonological processing, pseudo-word decoding
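A minimal sketch of the hierarchical regression described above is given below, using Python and statsmodels. The data are synthetic, and the column names (syllable_awareness, ran_speed, pseudoword_repetition, pseudoword_reading) are hypothetical stand-ins for the study's measures; the point is only to show how predictors are entered in theory-driven steps and how the change in R² is inspected at each step.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-ins for the study's variables; all names are hypothetical.
rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "syllable_awareness": rng.normal(20, 4, n),
    "ran_speed": rng.normal(60, 10, n),
    "pseudoword_repetition": rng.normal(15, 3, n),
})
df["pseudoword_reading"] = (0.8 * df["syllable_awareness"]
                            - 0.05 * df["ran_speed"]
                            + rng.normal(0, 2, n))

# Hierarchical (sequential) OLS: add predictors in a fixed order and track R².
steps = [["syllable_awareness"],
         ["syllable_awareness", "ran_speed"],
         ["syllable_awareness", "ran_speed", "pseudoword_repetition"]]
prev_r2 = 0.0
for cols in steps:
    model = sm.OLS(df["pseudoword_reading"], sm.add_constant(df[cols])).fit()
    print(cols, "R2 =", round(model.rsquared, 3),
          "delta R2 =", round(model.rsquared - prev_r2, 3))
    prev_r2 = model.rsquared
```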
Procedia PDF Downloads 175356 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar
Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo
Abstract:
The future of solar energy in Qatar is evolving steadily. Hence, high-quality spatial solar radiation data is of the utmost importance for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data is developed from physical and statistical models. Ground data is collected by solar radiation measurement stations. The ground data is of high quality; however, it is limited to distributed point locations, with high installation and maintenance costs for the ground stations. On the other hand, satellite solar radiation data is continuous and available across geographical locations, but it is relatively less accurate than ground data. To utilize the advantages of both, a product has been developed here which provides spatial continuity and higher accuracy than either dataset alone. The popular satellite database, the National Solar Radiation Database, NSRDB (PSM V3 model, spatial resolution: 4 km), was chosen here for merging with ground-measured solar radiation measurements in Qatar. The spatial distribution of ground solar radiation measurement stations is comprehensive in Qatar, with a network of 13 ground stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data is used for error analysis. Normalized root mean square error (NRMSE) values of 3.31%, 6.53%, and 6.63% for October, November, and December 2019, respectively, were observed when comparing in-situ and NSRDB data. The method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS, ESRI. The workflow of the algorithm is based on a combination of regression and kriging methods. A regression model (OLS, ordinary least squares) is fitted between the ground and NSRDB data points. A semi-variogram model is fitted to the experimental semi-variogram obtained from the residuals. The kriged residuals obtained after fitting the semi-variogram model are added to the NSRDB predicted values obtained from the regression model to obtain the final predicted values. The NRMSE values obtained after merging are 1.84%, 1.28%, and 1.81% for October, November, and December 2019, respectively. An additional explanatory variable, ground elevation, has been incorporated in the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps were created after merging, and NRMSE values of 1.24%, 1.28%, and 1.28% were observed for October, November, and December 2019, respectively. The proposed merging method has proven to be highly accurate. An additional method is also proposed here to generate calibrated maps using the regression and kriging model and, further, to use the calibrated model to generate solar radiation maps from the explanatory variables only when not enough historical ground data is available for long-term analysis. The NRMSE values obtained after comparing the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019, respectively.Keywords: global horizontal irradiation, GIS, empirical bayesian kriging regression prediction, NSRDB
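The regression-plus-residual-interpolation workflow described above can be sketched as follows. This is not the ArcGIS Empirical Bayesian Kriging Regression Prediction implementation: the kriging of residuals is replaced by a simple inverse-distance-weighting stand-in, the station data are synthetic, and the NRMSE definition used (RMSE divided by the mean of the observations) is one common convention.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def nrmse(obs, pred):
    """RMSE normalized by the mean of the observations (one common convention)."""
    return np.sqrt(np.mean((obs - pred) ** 2)) / np.mean(obs)

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance weighting of residuals -- a simplified stand-in for
    the kriging step of the regression-kriging workflow."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Synthetic stand-ins: 13 stations with monthly-mean ground GHI, the collocated
# satellite (NSRDB-like) estimate, and ground elevation as explanatory variables.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, (13, 2))            # station coordinates (km)
sat = rng.uniform(5.0, 6.5, 13)              # satellite GHI (kWh/m2/day)
elev = rng.uniform(0, 100, 13)               # elevation (m)
ground = 0.95 * sat + 0.002 * elev + rng.normal(0, 0.05, 13)

# Step 1: regression of ground GHI on the satellite estimate and elevation.
X = np.column_stack([sat, elev])
reg = LinearRegression().fit(X, ground)
residuals = ground - reg.predict(X)
print("NRMSE, raw satellite vs ground: ", round(nrmse(ground, sat), 4))
print("NRMSE, regression fit vs ground:", round(nrmse(ground, reg.predict(X)), 4))

# Step 2: at new map pixels, add interpolated residuals to the regression surface.
pix_xy = rng.uniform(0, 100, (5, 2))                    # pixel coordinates
pix_X = np.column_stack([rng.uniform(5.0, 6.5, 5),      # pixel satellite GHI
                         rng.uniform(0, 100, 5)])       # pixel elevation
merged_ghi = reg.predict(pix_X) + idw(xy, residuals, pix_xy)
print("merged GHI at 5 example pixels:", np.round(merged_ghi, 3))
```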
Procedia PDF Downloads 89355 Improving Binding Selectivity in Molecularly Imprinted Polymers from Templates of Higher Biomolecular Weight: An Application in Cancer Targeting and Drug Delivery
Authors: Ben Otange, Wolfgang Parak, Florian Schulz, Michael Alexander Rubhausen
Abstract:
The feasibility of extending the molecular imprinting technique to complex biomolecules is demonstrated in this research. This technique is promising in diverse applications such as drug delivery, disease diagnosis, catalysis, and impurity detection, as well as the treatment of various complications. While molecularly imprinted polymers (MIPs) remain robust for the synthesis of molecules with remarkable binding sites that have high affinities for specific molecules of interest, extending their usage to complex biomolecules has remained elusive. This work reports on the successful synthesis of MIPs from complex proteins: BSA, transferrin, and MUC1. We show in this research that, despite the heterogeneous binding sites and higher conformational flexibility of the chosen proteins, relying on their respective epitopes and motifs rather than the whole template produces highly sensitive and selective MIPs for specific molecular binding. Introduction: Proteins are vital in most biological processes, ranging from cell structure and structural integrity to complex functions such as transport and immunity in biological systems. Unlike other imprinting templates, proteins have heterogeneous binding sites in their complex long-chain structure, which makes their imprinting marred by challenges. In addressing this challenge, our attention is directed toward targeted delivery, which uses molecular imprinting on the particle surface so that these particles may recognize overexpressed proteins on the target cells. Our goal is thus to make nanoparticle surfaces that specifically bind to the target cells. Results and Discussion: Using epitopes of the BSA and MUC1 proteins and motifs with conserved receptors of transferrin as the respective templates for MIPs, significant improvement in MIP sensitivity to the binding of complex protein templates was noted. Through Fluorescence Correlation Spectroscopy (FCS) measurements of the size of the protein corona after incubation of the synthesized nanoparticles with proteins, we noted a high affinity of the MIPs for binding their respective complex proteins. In addition, quantitative analysis of the hard corona using SDS-PAGE showed that only the specific protein was strongly bound on the respective MIPs when incubated with similar concentrations of the protein mixture. Conclusion: Our findings have shown that the merits of MIPs can be extended to complex molecules of higher biomolecular mass. As such, the unique merits of the technique, including high sensitivity and selectivity, relative ease of synthesis, production of materials with higher physical robustness, and higher stability, can be extended to more templates that were previously not suitable candidates despite their abundance and usage within the body.Keywords: molecularly imprinted polymers, specific binding, drug delivery, high biomolecular mass-templates
Procedia PDF Downloads 55354 'You’re Not Alone': Peer Feedback Practices for Cross-Cultural Writing Classrooms and Centers
Authors: Cassandra Branham, Danielle Farrar
Abstract:
As writing instructors and writing center administrators at a large research university with a significant population of English language learners (ELLs), we are interested in how peer feedback pedagogy can be effectively translated for writing center purposes, as well as how various modes of peer feedback can enrich the learning experiences of L1 and L2 writers in these spaces. Although peer feedback is widely used in classrooms and centers, instructor, student, and researcher opinions vary with respect to its effectiveness. We argue that peer feedback - traditional and digital, synchronous and asynchronous - is an indispensable element for both classrooms and centers and emphasize that it should occur with both L1 and L2 students to further develop an array of reading and writing skills. We also believe that further understanding of the best practices of peer feedback in such cross-cultural spaces, like the classroom and center, can optimize the benefits of peer feedback. After a critical review of the literature, we implemented an embedded tutoring program in our university's writing center in collaboration with its First-Year Composition (FYC) program and Language Institute. The embedded tutoring program matches a graduate writing consultant with L1 and L2 writers enrolled in controlled-matriculation composition courses where ELLs make up at least 50% of each class. Furthermore, this program is informed by what we argue to be some best practices of peer feedback for both classroom and center purposes, including expectation-based training through rubrics, modeling effective feedback, hybridizing traditional and digital modes of feedback, recognizing the significance of the body in composition (what we call writer embodiment), and maximizing digital technologies to exploit extended cognition. After conducting surveys and follow-up interviews with students, instructors, and writing consultants in the embedded tutoring program, we found that not only did students see an increased value in peer feedback, but instructors also saw an improvement in both writing style and critical thinking skills. Our L2 participants noted improvements in language acquisition, while our L1 students recognized a broadening of their worldviews. We believe that both L1 and L2 students developed self-efficacy and agency in their identities as writers because they gained confidence in their abilities to offer feedback, as well as in the legitimacy of the feedback they received from peers. We also argue that these best practices situate novice writers as experts, as writers become a valued and integral part of the revision process with their own and their peers' papers. Finally, the use of iPads in embedded tutoring recovered the importance of the body and its senses in writing; the highly sensory feedback from these multi-modal sessions that offer audio and visual input underscores the significant role both the body and mind play in compositional practices. After beginning with a brief review of the literature that sparked this research, this paper will discuss the embedded tutoring program in detail, report on the results of the pilot program, and conclude with a discussion of the pedagogical implications that arise from this research for both classroom and center.Keywords: English language learners, peer feedback, writing center, writing classroom
Procedia PDF Downloads 403353 Stock Prediction and Portfolio Optimization Thesis
Authors: Deniz Peksen
Abstract:
This thesis aims to predict the trend movement of the closing price of stocks and to maximize a portfolio by utilizing the predictions. In this context, the study aims to define a stock portfolio strategy from models created using Logistic Regression, Gradient Boosting, and Random Forest. Recently, predicting the trend of stock prices has gained a significant role in making buy and sell decisions and in generating returns with investment strategies formed by machine-learning-based decisions. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition. Ours is a classification problem focusing on the market trend over the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years were used for validation, and the final period was used for testing. Training data are between 2002-06-18 and 2016-12-30, validation data are between 2017-01-02 and 2019-12-31, and testing data are between 2020-01-02 and 2022-03-17. We determine the Hold Stock Portfolio, the Best Stock Portfolio, and the USD-TRY exchange rate as benchmarks which we should outperform. We compared our machine-learning-based portfolio return on test data with the returns of the Hold Stock Portfolio, the Best Stock Portfolio, and the USD-TRY exchange rate. We assessed our model performance with the help of the ROC-AUC score and lift charts. We use Logistic Regression, Gradient Boosting, and Random Forest with a grid-search approach to fine-tune hyper-parameters. As a result of the empirical study, the existence of uptrends and downtrends of five stocks could not be predicted by the models. When we use these predictions to define buy and sell decisions in order to generate a model-based portfolio, the model-based portfolio fails on the test dataset. It was found that model-based buy and sell decisions generated a stock portfolio strategy whose returns could not outperform non-model portfolio strategies on the test dataset. We found that any effort to predict a trend formulated on the stock price is a challenge. We found results consistent with the Random Walk Theory, which claims that stock prices or price changes are unpredictable. Our model iterations failed on the test dataset. Although we built several good models on the validation dataset, they failed on the test dataset. We implemented Random Forest, Gradient Boosting, and Logistic Regression. We discovered that the complex models did not provide an advantage or additional performance when compared with Logistic Regression. More complexity did not lead us to better performance. Using a complex model is not an answer to the stock-related prediction problem. Our approach was to predict the trend instead of the price, which converted our problem into classification. However, this labeling approach does not solve the stock prediction problem, nor does it refute the accuracy of the Random Walk Theory for stock prices.Keywords: stock prediction, portfolio optimization, data science, machine learning
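A minimal sketch of the model-building step described above, using scikit-learn: a chronological train/validation/test split, a grid search over the Logistic Regression regularization parameter scored by ROC-AUC, and out-of-sample evaluation. The features and labels are synthetic placeholders, so the numbers it prints carry no meaning for the thesis results.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic daily features standing in for real price-derived features;
# the label marks whether the close is higher 20 trading days ahead.
rng = np.random.default_rng(3)
n = 4000
X = pd.DataFrame(rng.normal(size=(n, 5)), columns=[f"feat_{i}" for i in range(5)])
y = (rng.uniform(size=n) > 0.5).astype(int)

# Chronological split: train, then validation, then test (no shuffling).
train, valid, test = np.split(np.arange(n), [int(0.7 * n), int(0.85 * n)])

pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
grid = GridSearchCV(pipe,
                    {"logisticregression__C": [0.01, 0.1, 1.0, 10.0]},
                    scoring="roc_auc",
                    cv=TimeSeriesSplit(n_splits=5))   # time-ordered CV folds
grid.fit(X.iloc[train], y[train])

print("validation ROC-AUC:",
      roc_auc_score(y[valid], grid.predict_proba(X.iloc[valid])[:, 1]))
print("test ROC-AUC:",
      roc_auc_score(y[test], grid.predict_proba(X.iloc[test])[:, 1]))
```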
Procedia PDF Downloads 81352 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production
Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng
Abstract:
Dryland smallholder farmers in South Africa are vulnerable to all kinds of risks, which negatively affect crop productivity and profit. Pigeonpea is a leguminous and multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers are still growing pigeonpea from traditional unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies to alleviate these risk factors in pigeonpea production. The study was conducted in two provinces (Limpopo and Mpumalanga) of South Africa, in six municipalities, during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from the farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually using the questionnaire. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The collected data were analysed using SPSS statistical software version 25. The findings of the study were that the majority of farmers affected by risk factors were women, subsistence farmers, and older farmers, resulting in low food production. Drought, unavailability of improved pigeonpea seeds for planting, access to information, and processing equipment were found to be the main risk factors contributing to low crop productivity in farmers' fields. Above 80% of farmers lack knowledge on the improvement of the crop and also on the processing techniques needed to secure high prices during the crop off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, which were triggered by the major risk factors. The minor risk factors can be corrected only if the major risk factors are first given the necessary attention. About 10% of the farmers were found to use the crop as a mulch to reduce soil temperatures and to improve soil fertility. The study revealed that most of the farmers were unaware of its utilisation as fodder, mulch, medicine, for nitrogen fixation, and much more. The risk of frequent drought in dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are driven by climate change, as erratic, low rainfall and extreme temperatures pose a threat to food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, provision of processing equipment, and support from all stakeholders will help in addressing food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers and to involve them in addressing these risk factors. Awareness should be prioritized in promoting the crop to improve its production and commercialization in the dryland farming system of South Africa.Keywords: management strategies, pigeonpea, risk factors, smallholder farmers
Procedia PDF Downloads 213351 An Integrative Review on Effects of Educational Interventions for Children with Eczema
Authors: Nam Sze Cheng, P. C. Janita Chau
Abstract:
Background: Eczema is a chronic inflammatory disease with high global prevalence rates in many childhood populations. It is also the most common paediatric skin problem. Although eczema education and proper skin care are effective in controlling eczema symptoms, the lack of both sufficient time for patient consultation and structured eczema education programmes hinders the transfer of knowledge to patients and their parents. As a result, these young patients and their families suffer from significant physical disability and psychological distress, which can substantially impair their quality of life. Objectives: This integrative review examines the effects of educational interventions for children with eczema and identifies the core elements associated with an effective intervention. Methods: This integrative review targeted all articles published in 10 databases between May 2016 and February 2017 that reported the outcomes of disease interventions of any format for children and adolescents under 18 years of age with a clinical diagnosis of eczema. Five randomized controlled trials (RCTs) and one systematic review of 10 RCTs were identified for review. All these publications had high methodological quality, except one study of web-based eczema education that was limited by selection bias and poor subject blinding. Findings: This review found that most studies adopted nurse-led or multi-disciplinary parental eczema education programmes in the outpatient clinic setting. The formats of these programmes included individual lectures, demonstrations, and group sharing, and the educational materials covered basic eczema knowledge and management as well as methods to interrupt the itch-scratch cycle. The main outcome measures of these studies included the severity of eczema symptoms, treatment adherence, and the quality of life of both patients and their families. Nine included studies reported statistically significant improvement in the primary outcome of symptom severity in these eczematous children. On the other hand, the reviews failed to identify an effective dosage of intervention under these educational programmes, which was attributed to the heterogeneity of the interventions. One study that was designed based on social cognitive theory to guide the intervention content yielded statistically significant results. The systematic review emphasized the importance of measuring parental self-efficacy. Implication: This integrative review concludes that structured educational programmes can help nurses understand the theories behind different health interventions. They can then deliver eczema education to their patients in a consistent manner. These interventions also result in behavioral changes through patient education. Due to the lack of validated educational programmes in Chinese, it is imperative to conduct an RCT of an eczema educational programme to investigate its effects on eczema severity, quality of life, and treatment adherence in Hong Kong children, as well as to promote the importance of parental self-efficacy.Keywords: children, eczema, education, intervention
Procedia PDF Downloads 118350 The Effects of Aging on Visuomotor Behaviors in Reaching
Authors: Mengjiao Fan, Thomson W. L. Wong
Abstract:
It is unavoidable that older adults may have to deal with aging-related motor problems. Aging is highly likely to affect motor learning and control as well. For example, older adults may suffer from poor motor function and quality of life due to age-related eye changes. These adverse changes in vision result in impairment of movement automaticity. Reaching is a fundamental component of various complex movements, and it is therefore a useful task for exploring changes and adaptation in visuomotor behaviors. The current study aims to explore how aging affects visuomotor behaviors by comparing motor performance and gaze behaviors between two age groups (i.e., young and older adults). Visuomotor behaviors in reaching, under conditions providing or blocking online visual feedback (simulated visual deficiency), were investigated in 60 healthy young adults (mean age = 24.49 years, SD = 2.12) and 37 older adults (mean age = 70.07 years, SD = 2.37) with normal or corrected-to-normal vision. Participants in each group were randomly allocated into two subgroups. Subgroup 1 was provided with online visual feedback of the hand-controlled mouse cursor, whereas in subgroup 2, visual feedback was blocked to simulate visual deficiency. The experimental task required participants to complete 20 reaches to a target by controlling the mouse cursor on the computer screen. In all 20 trials, the start position was upright in the center of the screen, and the target appeared at a position randomly selected by the tailor-made computer program. Primary outcomes of motor performance and gaze behavior data were recorded by the EyeLink II (SR Research, Canada). The results suggested that aging significantly affects performance of the reaching task in both visual feedback conditions. In both age groups, blocking online visual feedback of the cursor during reaching resulted in longer hand movement time (p < .001), longer reaching distance away from the target center (p < .001), and poorer reaching motor accuracy (p < .001). Concerning gaze behaviors, blocking online visual feedback increased the first fixation duration in young adults (p < .001) but decreased it in older adults (p < .001). In addition, under the condition of providing online visual feedback of the cursor, older adults showed a longer fixation dwell time on the target throughout reaching than the young adults (p < .001), although the effect was not significant under the blocking condition (p = .215). Therefore, the results suggested that different levels of visual feedback during movement execution can affect gaze behaviors differently in older and young adults. Differential effects of aging on visuomotor behaviors appear under the two visual feedback patterns (i.e., blocking or providing online visual feedback of the hand-controlled cursor in reaching). Several specific gaze behaviors were found among the older adults, which imply that blocking visual feedback may act as a stimulus that induces extra perceptual load during movement execution, and age-related visual degeneration might further deteriorate the situation. This provides insight for the future development of potential rehabilitative training methods (e.g., well-designed errorless training) to enhance visuomotor adaptation in the aging population, improving their movement automaticity by facilitating compensation for visual degeneration.Keywords: aging effect, movement automaticity, reaching, visuomotor behaviors, visual degeneration
Procedia PDF Downloads 312349 Expanding Behavioral Crisis Care: Expansion of Psychiatric and Addiction-Care Services through a 23/7 Behavioral Crisis Center
Authors: Garima Singh
Abstract:
Objectives: A Behavioral Crisis Center (BCC) is a community solution to a community problem. There has been an exponential increase in the incidence and prevalence of mental health crises around the world. The effects of these crises negatively impact our patients and their families and strain law enforcement and emergency rooms. The goal of the multi-disciplinary care model is to break the crisis cycle and provide 24/7 rapid access to care and crisis stabilization. We initiated our first BCC in 2020, in the midst of the COVID pandemic, and have seen a remarkable improvement in patient care and a positive financial outcome. Background: Mental illnesses are common in the United States. Nearly one in five U.S. adults live with a mental illness (52.9 million in 2020). This number represented 21.0% of all U.S. adults. To address some of these challenges and help our community, in May 2020 we opened our first Behavioral Crisis Center (BCC). Since then, we have served more than 2,500 patients, and it is southwest Missouri's first 24/7 facility for crisis-level behavioral health and substance use needs. It has proven to be a more effective setting than emergency departments, jails, or local law enforcement. Methods: The BCC was started in 2020 to serve the unmet needs of the community and provide access to the behavioral health and substance use services identified in the community. Funding was made possible with significant investment from the county and the Missouri Foundation for Health, with contributions from medical partners. It is a multi-disciplinary care center consisting of physicians, nurse practitioners, nurses, behavioral technicians, peer support specialists, clinical intake specialists, clinical coordinators, and hospitality specialists. The center provides services including psychiatric care, outpatient therapy, community support services, primary care, and peer support and engagement. It is connected to a residential treatment facility for substance use treatment, ensuring continuity of care and bridging the gap, which has resulted in treatment completion and better outcomes. Results: The BCC has proven to be a great resource for the community, and the Missouri Health Coalition is providing funding to replicate the model in other regions and to develop a similar model for children and adolescents. Overall, 29% of the patients seen at the BCC are stabilized and discharged with outpatient care, 50% need acute stabilization in a hospital setting, and 21% require long-term admission, mostly for substance use treatment. The local emergency room had a 42% reduction in behavioral health encounters compared to the previous 3 years. Also, through quick transfer to the BCC, the average stay in the ER was reduced by 10 hours, and the time to follow-up behavioral health assessment decreased by an average of 4 hours. Uninsured patients are also provided Medicaid application assistance, which has benefited 55% of individuals receiving care at the BCC. Conclusions: The BCC is impacting community health and improving access to quality care and substance use treatment. It is a great investment for our patients and families.Keywords: BCC, behavioral health, community health care, addiction treatment
Procedia PDF Downloads 77348 University Curriculum Policy Processes in Chile: A Case Study
Authors: Victoria C. Valdebenito
Abstract:
Located within the context of accelerating globalization in the 21st-century knowledge society, this paper focuses on one selected university in Chile at which radical curriculum policy changes have been taking place, diverging from the traditional undergraduate curriculum in Chile, as part of a larger investigation. Using a 'policy trajectory' framework, and guided by an interpretivist approach to research, interview transcripts and institutional documents were analyzed in relation to the meso (university administration) and the micro (academics) levels. Within the case study, participants from the university administration and academic levels were selected both via the snowball technique and purposive selection; thus, they had different levels of seniority, with some participating actively in the curriculum reform processes. Documents and interview transcripts were analyzed to reveal the major themes emerging from the data. A further 'bigger picture' analysis guided by critical theory was then undertaken, involving interrogation of underlying ideologies and how political and economic interests influence the cultural production of policy. The case-study university was selected because it represents a traditional, long-established university setting in the country, undergoing curriculum changes based on international trends such as the competency model and the liberal arts. It is also representative of a particular socioeconomic sector of the country. Access to the university was gained through email contact. Qualitative research methods were used, namely interviews and analysis of institutional documents. In all, 18 people were interviewed; the number was determined by when the saturation criterion was met. Semi-structured interview schedules were based on the four research questions about influences, policy texts, policy enactment, and longer-term outcomes. Triangulation of information was used for the analysis. While there was no intention to generalize the specific findings of the case study, the results of the research were used as a focus for engagement with broader themes, often evident in global higher education policy developments. The research results were organized around major themes in three of the four contexts of the 'policy trajectory'. Regarding the context of influences and the context of policy text production, themes relate to the hegemony exercised by first-world countries' universities in the higher education field, its associated neoliberal ideology, with accountability and the discourse of continuous improvement, the local responses to those pressures, and the value of interdisciplinarity. Finally, regarding the context of policy practices and effects (enactment), themes emerged around the impacts of the curriculum changes on university staff and students, and resistance amongst academics. The research concluded with a few recommendations that potentially provide 'food for thought' beyond the localized settings of this study, as well as possibilities for further research.Keywords: curriculum, global-local dynamics, higher education, policy, sociology of education
Procedia PDF Downloads 79347 Midterm Clinical and Functional Outcomes After Treatment with Ponseti Method for Idiopathic Clubfeet: A Prospective Cohort Study
Authors: Neeraj Vij, Amber Brennan, Jenni Winters, Hadi Salehi, Hamy Temkit, Emily Andrisevic, Mohan V. Belthur
Abstract:
Idiopathic clubfoot is a common lower extremity deformity with an incidence of 1:500. The Ponseti method is well known as the gold standard of treatment. However, there are limited functional data demonstrating correction of the clubfoot after treatment with the Ponseti method. The purpose of this study was to examine the clinical and functional outcomes after the Ponseti method with the Clubfoot Disease-Specific Instrument (CDS) and pedobarography. This IRB-approved prospective study included patients aged 3-18 who were treated for idiopathic clubfoot with the Ponseti method between January 2008 and December 2018. Age-matched controls were identified through siblings of clubfoot patients and other community members. Treatment details were collected through a chart review of the included patients. Laboratory assessment included a physical exam, gait analysis, and pedobarography. The Pediatric Outcomes Data Collection Instrument and the Clubfoot Disease-Specific Instrument were also administered to the clubfoot (CF) patients. The Wilcoxon rank-sum test was used to study differences between the CF patients and the typically developing (TD) patients. Statistical significance was set at p < 0.05. There were a total of 37 enrolled patients in our study: 21 had been treated for CF and 16 were TD. 94% of the CF patients had bilateral involvement. The age at the start of treatment was 29 days, the average total number of casts was seven to eight, and the average number of casts after Achilles tenotomy was one. The recurrence rate was 25%, tenotomy was required in 94% of patients, and ≥1 tenotomy was required in 25% of patients. There were no significant differences in step length, step width, stride length, force-time integral, maximum peak pressure, foot progression angles, stance phase time, single-limb support time, double-limb support time, or gait cycle time between children treated with the Ponseti method and typically developing children. The average post-treatment Pirani and Dimeglio scores were 5.50 ± 0.58 and 15.29 ± 1.58, respectively. The average post-treatment PODCI subscores were: Upper Extremity 90.28, Transfers 94.6, Sports 86.81, Pain 86.20, Happiness 89.52, Global 88.6. The average post-treatment Clubfoot Disease-Specific Instrument subscores were: Satisfaction 73.93, Function 80.32, Overall 78.41. The Ponseti method has a very high success rate and remains the gold standard in the treatment of idiopathic clubfoot. Timely management leads to good outcomes and a low need for repeated Achilles tenotomy. Children treated with the Ponseti method demonstrate good functional outcomes as measured through pedobarography. Pedobarography may have clinical utility in studying congenital foot deformities. Objective measures of hours of brace wear could represent an improvement in clubfoot care.Keywords: functional outcomes, pediatric deformity, patient-reported outcomes, talipes equinovarus
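As a small illustration of the statistical comparison described above, the sketch below runs a Wilcoxon rank-sum test on two groups using SciPy. The measurements are synthetic placeholders (hypothetical stride lengths for 21 treated and 16 control children), not data from the study.

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical pedobarographic measurements (e.g., stride length in cm)
# for Ponseti-treated clubfoot (CF) and typically developing (TD) children.
rng = np.random.default_rng(4)
cf = rng.normal(95, 8, 21)   # 21 treated children (synthetic values)
td = rng.normal(97, 8, 16)   # 16 controls (synthetic values)

stat, p = ranksums(cf, td)   # Wilcoxon rank-sum test
print(f"rank-sum statistic = {stat:.2f}, p = {p:.3f}")
print("significant at 0.05" if p < 0.05 else "no significant difference")
```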
Procedia PDF Downloads 80346 Numerical Model of Crude Glycerol Autothermal Reforming to Hydrogen-Rich Syngas
Authors: A. Odoom, A. Salama, H. Ibrahim
Abstract:
Hydrogen is a clean source of energy for power production and transportation. The main source of hydrogen in this research is derived from biodiesel production. Glycerol, also called glycerine, is a by-product of biodiesel production by transesterification of vegetable oils with methanol. This is a more reliable and environmentally friendly source of hydrogen than fossil fuels. A typical composition of crude glycerol comprises glycerol, water, organic and inorganic salts, soap, methanol, and small amounts of glycerides. Crude glycerol has limited industrial application due to its low purity; thus, the use of crude glycerol can significantly enhance the sustainability and production of biodiesel. Reforming techniques are an approach to hydrogen production, mainly Steam Reforming (SR), Autothermal Reforming (ATR), and Partial Oxidation Reforming (POR). SR produces high hydrogen conversion and yield but is highly endothermic, whereas POR is exothermic. On the downside, POR yields less hydrogen as well as a large number of side reactions. ATR, which is a fusion of partial oxidation reforming and steam reforming, is thermally neutral because the net reactor heat duty is zero. It has a relatively high hydrogen yield and selectivity and limits coke formation. The complex chemical processes that take place during the production phases make it relatively difficult to construct a reliable and robust numerical model. A numerical model is a tool to mimic reality and provide insight into the influence of the parameters. In this work, we introduce a finite volume numerical study for an 'in-house' lab-scale experiment of ATR. Previous numerical studies on this process have used either Comsol or nodal finite difference analysis. Comsol is a commercial package that is not readily available everywhere, and the lab-scale experiment can be considered well mixed in the radial direction, so one spatial dimension suffices to capture the essential features of ATR; in this work, we therefore develop our own numerical approach using MATLAB. A continuum fixed-bed reactor is modelled using MATLAB with both pseudo-homogeneous and heterogeneous models. The drawback of the nodal finite difference formulation is that it is not locally conservative, which means that materials and momenta can be generated inside the domain as an artifact of the discretization. The control volume method, on the other hand, is locally conservative and suits very well problems where materials are generated and consumed inside the domain. In this work, the species mass balance, Darcy's equation, and the energy equations are solved using an operator splitting technique. Diffusion-like terms are discretized implicitly, while advection-like terms are discretized explicitly. An upwind scheme is adopted for the advection term to ensure accuracy and positivity. Comparisons with the experimental data show very good agreement, which builds confidence in our modeling approach. The models obtained were validated and optimized for better results.Keywords: autothermal reforming, crude glycerol, hydrogen, numerical model
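To make the operator-splitting strategy concrete, the sketch below solves a single 1-D advection-diffusion equation on a finite-volume grid with explicit first-order upwind advection and implicit (backward Euler) diffusion. It is a simplified illustration in Python rather than the authors' MATLAB reactor model: reaction source terms, Darcy's equation, and the energy balance are omitted, and all parameter values are illustrative.

```python
import numpy as np

# One transported species along a 1-D packed bed:
#   dc/dt + u dc/dx = D d2c/dx2
# Advection: explicit first-order upwind (CFL-limited step).
# Diffusion: implicit backward Euler, so it never limits the time step.
nx, L = 100, 0.1             # cells, bed length (m) -- illustrative values
dx = L / nx
u, D = 0.05, 1e-5            # velocity (m/s), dispersion coefficient (m2/s)
dt = 0.4 * dx / u            # time step satisfying the advective CFL condition
c = np.zeros(nx)             # initial concentration (normalized)
c_in = 1.0                   # inlet concentration (normalized)

# Constant tridiagonal system (I - r * Laplacian) for the implicit diffusion step.
r = D * dt / dx**2
A = (np.diag((1 + 2 * r) * np.ones(nx))
     + np.diag(-r * np.ones(nx - 1), 1)
     + np.diag(-r * np.ones(nx - 1), -1))
A[0, :2] = [1 + r, -r]       # zero-gradient boundaries for the diffusion step
A[-1, -2:] = [-r, 1 + r]

for _ in range(400):
    # 1) explicit upwind advection (flow in the +x direction)
    upstream = np.concatenate(([c_in], c[:-1]))
    c = c - u * dt / dx * (c - upstream)
    # 2) implicit diffusion
    c = np.linalg.solve(A, c)

print("outlet concentration after 400 steps:", round(c[-1], 4))
```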
Procedia PDF Downloads 144345 Investigating Sub-daily Responses of Water Flow of Trees in Tropical Successional Forests in Thailand
Authors: Pantana Tor-Ngern
Abstract:
In the global water cycle, tree water use (Tr) contributes largely to evapotranspiration, which is the total amount of water evaporated from terrestrial ecosystems to the atmosphere, regulating climates. Tree water use responds to environmental factors, including atmospheric humidity and sunlight (represented by vapor pressure deficit, VPD, and photosynthetically active radiation, PAR, respectively) and soil moisture. In forests, Tr responses to such factors depend on species and their spatial and temporal variations. Tropical forests in Southeast Asia (SEA) have experienced land-use conversion from abandoned agricultural practices, resulting in patches of forest at different stages, including old-growth and secondary forests. Because inherent structures, such as canopy height and tree density, vary significantly among forests at different stages and can strongly affect their respective microclimates, Tr and its responses to changing environmental conditions in successional forests may differ. Daily and seasonal variations in the environmental factors may exert significant impacts on the respective Tr patterns. Extrapolating Tr data from short periods of days to longer periods of seasons or years can be complex and is important for estimating long-term ecosystem water use, which often includes normal and abnormal climatic conditions. Thus, this study aims to investigate the diurnal variation of Tr, using measured sap flux density (JS) data, with changes in VPD in eight evergreen tree species in an old-growth forest (hereafter OF; >200 years old) and a young forest (hereafter YF; <10 years old) in Khao Yai National Park, Thailand. The studied species included Syzygium syzygoides, Aquilaria crassna, Cinnamomum subavenium, Nephelium melliferum, and Altingia excelsa in OF, and Syzygium nervosum and Adinandra integerrima in YF. Only Syzygium antisepticum was found in both forest stages. Specifically, hysteresis, which indicates asymmetrical changes of JS in response to changing VPD across the daily timescale, was examined in these species. Results showed no hysteresis in any species in OF, except Altingia excelsa, which exhibited a 3-hour delayed JS response to VPD. In contrast, JS of all species in YF displayed one-hour delayed responses to VPD. The OF species that showed no hysteresis indicated that their canopies were well coupled with the atmosphere, facilitating the gas exchange that is essential for tree growth. The delayed responses in Altingia excelsa in OF and in all species in YF were associated with higher JS in the morning than in the afternoon. This implies that these species were sensitive to drying air, closing their stomata relatively rapidly in response to decreasing atmospheric humidity (increasing VPD). Such behavior is often observed in trees growing in dry environments. This study suggests that detailed investigation of JS at sub-daily timescales is imperative for a better understanding of the mechanistic responses of trees to the changing climate, which will benefit the improvement of earth system models.Keywords: sap flow, tropical forest, forest succession, thermal dissipation probe
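One simple way to quantify the delayed (hysteretic) JS response described above is to find the lag that maximizes the correlation between JS and VPD over the diurnal course. The sketch below does this with synthetic half-hourly curves; it is an illustrative analysis under assumed data, not the method used in the study.

```python
import numpy as np

# Synthetic half-hourly diurnal courses standing in for measured VPD and sap
# flux density (JS); the JS series is constructed to lag VPD by one hour.
dt_minutes = 30
t = np.arange(0, 24 * 60, dt_minutes)                  # minutes since midnight
vpd = np.clip(np.sin((t - 6 * 60) / (24 * 60) * 2 * np.pi), 0, None)
true_lag_steps = 2                                      # 1 hour at 30-min resolution
js = np.roll(vpd, true_lag_steps) + np.random.default_rng(5).normal(0, 0.02, t.size)

def estimate_lag(js, vpd, max_lag=8):
    """Return the lag (in time steps) at which JS best correlates with VPD.

    A positive lag means JS trails VPD, i.e. the delayed response that
    produces hysteresis in the JS-VPD loop."""
    lags = range(-max_lag, max_lag + 1)
    corrs = [np.corrcoef(np.roll(vpd, lag), js)[0, 1] for lag in lags]
    return list(lags)[int(np.argmax(corrs))]

lag = estimate_lag(js, vpd)
print(f"estimated lag: {lag} steps = {lag * dt_minutes} minutes")
```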
Procedia PDF Downloads 60