Search results for: residuals
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 66

66 Kinetics Study for the Recombinant Cellulosome to the Degradation of Chlorella Cell Residuals

Authors: C. C. Lin, S. C. Kan, C. W. Yeh, C. I. Chen, C. J. Shieh, Y. C. Liu

Abstract:

In this study, lipid-deprived residuals (LDRs) of microalgae were hydrolyzed for the production of reducing sugars using a recombinant Bacillus cellulosome carrying eight genes from Clostridium thermocellum ATCC 27405. The obtained cellulosome was found to exist mostly in the broth supernatant, with a cellulosome activity of 2.4 U/mL. Furthermore, the Michaelis-Menten constant (Km) and Vmax of the cellulosome were found to be 14.832 g/L and 3.522 U/mL, respectively. The activation energy of the cellulosome for hydrolyzing the microalgal LDRs was calculated as 32.804 kJ/mol.
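
As a minimal illustration of how the reported constants are used, the sketch below evaluates the standard Michaelis-Menten rate law with the Km and Vmax quoted above and an Arrhenius ratio with the reported activation energy; the substrate concentration and temperatures are hypothetical, not values from the study.

```python
import numpy as np

# Constants reported in the abstract
Km = 14.832    # g/L
Vmax = 3.522   # U/mL
Ea = 32.804e3  # J/mol (activation energy)
R = 8.314      # J/(mol*K), universal gas constant

def michaelis_menten(S):
    """Hydrolysis rate v (U/mL) at substrate concentration S (g/L)."""
    return Vmax * S / (Km + S)

def arrhenius_speedup(T1, T2):
    """Ratio k(T2)/k(T1) of rate constants for temperatures in kelvin."""
    return np.exp(-Ea / R * (1.0 / T2 - 1.0 / T1))

# Illustrative values only: rate at 20 g/L LDR substrate, speed-up from 45 to 55 deg C
print(round(michaelis_menten(20.0), 3))             # ~2.02 U/mL
print(round(arrhenius_speedup(318.15, 328.15), 2))  # ~1.46x faster
```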

Keywords: lipid-deprived residuals of microalgae, cellulosome, cellulose, reducing sugars, kinetics

Procedia PDF Downloads 364
65 The Effect of Restaurant Residuals on Performance of Japanese Quail

Authors: A. A. Saki, Y. Karimi, H. J. Najafabadi, P. Zamani, Z. Mostafaie

Abstract:

Restaurant residuals are important for reasons such as the competition between human and animal consumption of cereals, increasing environmental pollution, and the high cost of producing livestock products. Because restaurant residuals have a high nutritional value (high protein and energy), they could replace part of poultry diets, especially for Japanese quail. Today, the processing and use of these residuals remains a challenge confronting modern industry. Increasing costs, pressures, and problems associated with waste disposal reinforce the need to re-evaluate and utilize waste as livestock and poultry feed. This study aimed to investigate the effects of different levels of restaurant residuals on the performance of 300 laying Japanese quails. The experiment included 5 treatments, 4 replicates, and 15 quails per replicate, from 10 to 18 weeks of age, in a completely randomized design (CRD). The treatments consisted of a basal diet of corn and soybean meal (without restaurant residuals) and diets containing 5, 10, 15 and 20% restaurant residuals (treatments 2, 3, 4 and 5, respectively). There was no significant effect of restaurant residual level on body weight (BW), feed conversion ratio (FCR), egg production percentage (EP), or egg mass (EM) between treatments (P > 0.05). However, feed intake (FI) at 5% restaurant residuals was significantly higher than for the 20% treatment (P < 0.05). Egg weight (EW) was also higher with 20% restaurant residuals compared with 10% (P < 0.05). Yolk weight (YW) in the treatments containing 10 and 20% restaurant residuals was significantly higher than in the control (P < 0.05). Egg white weight (EWW) in the 20 and 5% restaurant residual treatments was significantly higher than at 10% (P < 0.05). Furthermore, EW, egg weight to shell surface area, and egg surface area in the 20% treatment were significantly higher than in the control and the 10% treatment (P < 0.05). The overall results of this study show that restaurant residuals at levels of 10 and 15 percent could replace part of the ration of laying quail without any adverse effect.

Keywords: by-product, laying quail, performance, restaurant residuals

Procedia PDF Downloads 136
64 Efficiency of Background Chlorine Residuals against Accidental Microbial Episode in Proto-Type Distribution Network (Rig) Using Central Composite Design (CCD)

Authors: Sajida Rasheed, Imran Hashmi, Luiza Campos, Qizhi Zhou, Kim Keu

Abstract:

A quadratic model (p < 0.0001) was developed using a central composite design of 50 experimental runs (42 non-center + 8 center points) to assess the efficiency of background chlorine residuals in combating an accidental microbial episode in a prototype distribution network (DN) (rig). A known level of background chlorine residual was maintained in the DN, and a required number of bacteria (Escherichia coli K-12 strain) were introduced through an injection port in the pipe loop system. Samples were taken at various time intervals at different pipe lengths, and spread plate counting was performed to enumerate the bacteria. The developed model was significant, with microbial concentration and time (p < 0.0001), pipe length (p < 0.022), background chlorine residual (p < 0.07) and time² (p < 0.09) as significant factors. The ramp function of the variables shows that at a microbial count of 10^6, a flow rate of 0.76 L/min, and a pipe length of 133 meters, a background residual chlorine of 0.16 mg/L was enough for complete inactivation of the microbial episode in approximately 18 minutes.
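
For readers unfamiliar with central composite designs, the sketch below fits a full second-order (quadratic) response-surface model by ordinary least squares, which is the kind of model the abstract reports; the factor names, ranges, response, and data are hypothetical placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 50 CCD runs: microbial load (log CFU), contact
# time (min), pipe length (m) and background chlorine residual (mg/L)
X = rng.uniform([4, 2, 10, 0.05], [8, 30, 150, 0.5], size=(50, 4))
# Placeholder response (log inactivation); the real response came from plate counts
y = 0.1 * X[:, 1] + 4.0 * X[:, 3] - 0.002 * X[:, 0] * X[:, 2] + rng.normal(0, 0.2, 50)

# Full quadratic model: linear, two-way interaction and squared terms
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
terms = quad.get_feature_names_out(["load", "time", "length", "cl2"])
print(dict(zip(terms, model.coef_.round(4))))
```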

Keywords: central composite design (CCD), distribution network, Escherichia coli, residual chlorine

Procedia PDF Downloads 434
63 A Comparison of Inverse Simulation-Based Fault Detection in a Simple Robotic Rover with a Traditional Model-Based Method

Authors: Murray L. Ireland, Kevin J. Worrall, Rebecca Mackenzie, Thaleia Flessa, Euan McGookin, Douglas Thomson

Abstract:

Robotic rovers which are designed to work in extra-terrestrial environments present a unique challenge in terms of the reliability and availability of systems throughout the mission. Should some fault occur, with the nearest human potentially millions of kilometres away, detection and identification of the fault must be performed solely by the robot and its subsystems. Faults in the system sensors are relatively straightforward to detect, through the residuals produced by comparison of the system output with that of a simple model. However, faults in the input, that is, the actuators of the system, are harder to detect. A step change in the input signal, caused potentially by the loss of an actuator, can propagate through the system, resulting in complex residuals in multiple outputs. These residuals can be difficult to isolate or distinguish from residuals caused by environmental disturbances. While a more complex fault detection method or additional sensors could be used to solve these issues, an alternative is presented here. Using inverse simulation (InvSim), the inputs and outputs of the mathematical model of the rover system are reversed. Thus, for a desired trajectory, the corresponding actuator inputs are obtained. A step fault near the input then manifests itself as a step change in the residual between the system inputs and the input trajectory obtained through inverse simulation. This approach avoids the need for additional hardware on a mass- and power-critical system such as the rover. The InvSim fault detection method is applied to a simple four-wheeled rover in simulation. Additive system faults and an external disturbance force are applied to the vehicle in turn, such that the dynamic response and sensor output of the rover are impacted. Basic model-based fault detection is then employed to provide output residuals which may be analysed to provide information on the fault/disturbance. InvSim-based fault detection is then employed, similarly providing input residuals which provide further information on the fault/disturbance. The input residuals are shown to provide clearer information on the location and magnitude of an input fault than the output residuals. Additionally, they can allow faults to be more clearly discriminated from environmental disturbances.
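
To make the InvSim idea concrete, here is a minimal sketch on a toy first-order speed model (not the paper's four-wheeled rover model): the measured output is run backwards through the model to recover the input, and the residual between the commanded and recovered input exposes a step actuator fault.

```python
import numpy as np

# Toy forward model: first-order speed dynamics v[k+1] = a*v[k] + b*u[k]
a, b, dt = 0.95, 0.08, 0.1

def forward(u, v0=0.0):
    v = np.empty(len(u) + 1)
    v[0] = v0
    for k, uk in enumerate(u):
        v[k + 1] = a * v[k] + b * uk
    return v

def inverse_sim(v):
    # Reverse the model: recover the input that would reproduce the measured output
    return (v[1:] - a * v[:-1]) / b

t = np.arange(0, 20, dt)
u_cmd = np.ones_like(t)               # commanded actuator input
u_actual = u_cmd.copy()
u_actual[t >= 10] -= 0.5              # actuator fault: step loss of effectiveness at t = 10 s
v_meas = forward(u_actual) + np.random.default_rng(1).normal(0, 0.002, len(t) + 1)

# Input residual: commanded input minus the input recovered by inverse simulation
r_input = u_cmd - inverse_sim(v_meas)
fault = np.abs(r_input) > 0.25        # simple fixed threshold
print("fault flagged from t =", t[fault][0], "s")
```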

Keywords: fault detection, ground robot, inverse simulation, rover

Procedia PDF Downloads 275
62 Residual Evaluation by Thresholding and Neuro-Fuzzy System: Application to Actuator

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. In this paper we propose a method of fault diagnosis based on a neuro-fuzzy technique and the choice of a threshold, validated on the DAMADICS actuator benchmark test bench. In the first phase of the method, we construct a model that represents the normal state of the system for fault detection. Residuals are then generated and analysed, and thresholds are chosen to build a signature table. These signatures provide us with groups of non-detectable faults. In the second phase, we build faulty models to capture the faults in the system that were not located in the first phase.

Keywords: residuals analysis, threshold, neuro-fuzzy system, residual evaluation

Procedia PDF Downloads 413
61 Artificial Neural Networks with Decision Trees for Diagnosis Issues

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

This paper presents a new fault detection and isolation (FDI) technique applied to an industrial system. The technique is based on neural network models of fault-free and faulty behaviours (NNFMs). The NNFMs are used for residual generation, while a decision tree architecture is used for residual evaluation. The decision tree is built with data collected from the NNFMs' outputs and is used to isolate detectable faults depending on computed thresholds. Each part of the tree corresponds to a specific residual. With the decision tree, it becomes possible to take the appropriate decision regarding the actual process behaviour by evaluating a small number of residuals. In comparison to the usual systematic evaluation of all residuals, the proposed technique requires less computational effort and can be used for online diagnosis. An application example is presented to illustrate and confirm the effectiveness and accuracy of the proposed approach.
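
The sketch below shows the residual-evaluation half of such a scheme with scikit-learn: residual vectors (here synthetic, standing in for the NNFM outputs) are fed to a shallow decision tree whose splits act as thresholds on individual residuals; the fault classes and numbers are illustrative only.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic residual vectors (3 residuals per sample) for three illustrative classes:
# 0 = fault-free, 1 = actuator fault, 2 = sensor fault
n = 300
r_healthy = rng.normal(0.0, 0.05, (n, 3))
r_actuator = rng.normal([0.8, 0.1, 0.0], 0.05, (n, 3))
r_sensor = rng.normal([0.0, 0.6, 0.7], 0.05, (n, 3))
X = np.vstack([r_healthy, r_actuator, r_sensor])
y = np.repeat([0, 1, 2], n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# A shallow tree: each split is a threshold on a single residual, so only a few
# residuals need to be evaluated to reach a decision
tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("isolation accuracy:", tree.score(X_te, y_te))
```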

Keywords: neural networks, decision trees, diagnosis, behaviors

Procedia PDF Downloads 451
60 Fault Diagnosis by Thresholding and Decision Tree with Neuro-Fuzzy System

Authors: Y. Kourd, D. Lefebvre

Abstract:

The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. This paper proposes a method of fault diagnosis based on a hybrid neuro-fuzzy structure that combines threshold selection and a decision tree. The method is validated on the DAMADICS benchmark. In the first phase, a model is constructed that represents the normal state of the system for fault detection. Signatures of the faults are obtained through residuals analysis and the selection of appropriate thresholds. These signatures yield groups of non-separable faults. In the second phase, we build faulty models to capture the faults in the system that cannot be isolated in the first phase. In the final phase, we construct the tree that isolates these faults.

Keywords: decision tree, residuals analysis, ANFIS, fault diagnosis

Procedia PDF Downloads 587
59 Fault Diagnosis of a Squirrel-Cage Induction Motor by Neural Network Multi-Models

Authors: Yahia Kourd, N. Guersi, D. Lefebvre

Abstract:

In this paper we study fault diagnosis in a squirrel-cage induction motor using MLP neural networks. We use neural models of the healthy and faulty behaviour in order to detect and isolate faults in the machine. In the first part of this work, we created a neural model for the healthy state using Matlab and a motor located at LGEB, by acquiring input and output data from this motor. We then detected faults in the machine by residual generation. These residuals alone are not sufficient to isolate the existing faults, so we proposed additional neural networks to represent the faulty behaviours. From the analysis of these residuals and the choice of a threshold, we propose a method capable of performing the detection and diagnosis of some faults in asynchronous machines with a squirrel-cage rotor.

Keywords: faults diagnosis, neural networks, multi-models, squirrel-cage induction motor

Procedia PDF Downloads 600
58 Competing Risks Modeling Using within Node Homogeneity Classification Tree

Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya

Abstract:

To design a tree that maximizes within-node homogeneity, a homogeneity measure is needed that is appropriate for event history data with multiple risks. We consider the use of Deviance and modified Cox-Snell residuals as measures of impurity in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which the homogeneity measures were based on the Martingale residual. A data-structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results of the univariate competing-risks analysis revealed that using Deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance measure. Bone marrow transplant data and a double-blinded randomized clinical trial conducted to compare two treatments for patients with prostate cancer were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from the empirical study of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (deviance = 16.6498) performs better than both the Martingale residual (deviance = 160.3592) and the Deviance residual (deviance = 556.8822) for both the event of interest and the competing risks. Additionally, results from the prostate cancer data also reveal the superiority of the proposed model over the existing one for both causes; interestingly, the Cox-Snell residual (MSE = 0.01783563) outperforms both the Martingale residual (MSE = 0.1853148) and the Deviance residual (MSE = 0.8043366). Moreover, these results validate those obtained from the Monte-Carlo studies.
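
As a reference for the three residual types compared above, the sketch below computes Cox-Snell, martingale, and deviance residuals from an event indicator and a model-estimated cumulative hazard evaluated at each subject's observed time; the values are hypothetical, and the paper's specific modification of the Cox-Snell residual is not reproduced.

```python
import numpy as np

def survival_residuals(event, cum_hazard):
    """Cox-Snell, martingale and deviance residuals.

    event: 1 for the event of interest, 0 for censored/competing observations.
    cum_hazard: model-estimated cumulative hazard H_i(T_i) at the observed time.
    """
    cox_snell = cum_hazard
    martingale = event - cox_snell
    # delta_i * log(delta_i - m_i) reduces to log(H_i) when an event occurred, 0 otherwise
    inner = martingale + np.where(event == 1, np.log(np.clip(cum_hazard, 1e-12, None)), 0.0)
    deviance = np.sign(martingale) * np.sqrt(-2.0 * inner)
    return cox_snell, martingale, deviance

# Hypothetical subjects (not from the bone marrow or prostate cancer datasets)
event = np.array([1, 0, 1, 1, 0])
H = np.array([0.40, 0.15, 1.20, 0.80, 0.05])
cs, m, d = survival_residuals(event, H)
print("martingale:", m.round(3), " deviance:", d.round(3))
```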

Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree

Procedia PDF Downloads 236
57 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin Leiby, Darryl Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the need to impute missing values well above the common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit the residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and the comparison of known values to the replaced values created through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions while presenting a need for further refinement that mimics predictive mean matching.
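
A minimal sketch of the core idea is given below with synthetic data: regress the incomplete variable on correlated complete variables, then impute the model prediction plus a randomly drawn residual so that imputed values carry the model's uncertainty. It does not reproduce the authors' tailorable-tolerance scheme.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Synthetic stand-in dataset: x1, x2 fully observed, y has ~30% missingness
n = 500
x = rng.normal(size=(n, 2))
y = 1.5 * x[:, 0] - 0.8 * x[:, 1] + rng.normal(0, 0.5, n)
missing = rng.random(n) < 0.30
y_obs = np.where(missing, np.nan, y)

# Fit on complete cases, capitalizing on correlation with the observed predictors
obs = ~missing
model = LinearRegression().fit(x[obs], y_obs[obs])
residuals = y_obs[obs] - model.predict(x[obs])

# Stochastic imputation: prediction plus a randomly drawn residual, so imputed
# values do not all sit exactly on the regression surface
y_imp = y_obs.copy()
y_imp[missing] = model.predict(x[missing]) + rng.choice(residuals, missing.sum())

rmse = np.sqrt(np.mean((y_imp[missing] - y[missing]) ** 2))
print("RMSE of imputed vs. true values:", rmse.round(3))
```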

Keywords: correlation, country conflict, imputation, stochastic regression

Procedia PDF Downloads 89
56 Studying the Photodegradation Behavior of Microplastics Released from Agricultural Plastic Products to the Farmland

Authors: Maryam Salehi, Gholamreza Bonyadinejad

Abstract:

The application of agricultural plastic products such as mulch, greenhouse covers, and silage films is increasing due to their economic benefits in providing an early and better-quality harvest. In 2015, the global market for agricultural plastic films, at 4 million tons (valued at 10.6 million USD), was estimated to grow by 5.6% per year through 2030. Despite the short-term benefits provided by plastic products, their long-term sustainability issues and negative impacts on soil health are not well understood. After their removal from the field, some plastic residuals remain in the soil. Plastic residuals in farmland may fragment into small particles called microplastics (d < 5 mm). The microplastics' exposure to solar radiation can alter their surface chemistry and make them susceptible to fragmentation. Thus, this study examined the photodegradation of low-density polyethylene as a model for microplastics released to agricultural farmland. The variation of the plastic's surface chemistry, morphology, and bulk characteristics was studied after accelerated UV-A radiation experiments and after sampling from an agricultural field. Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy (ATR-FTIR) and X-ray Photoelectron Spectroscopy (XPS) demonstrated the formation of oxidized functional groups on the microplastics' surface due to photodegradation. Differential Scanning Calorimetry (DSC) analysis revealed an increased crystallinity for the photodegraded microplastics compared to the new samples. Gel permeation chromatography (GPC) demonstrated a reduced molecular weight for the polymer due to photodegradation. This study provides an important opportunity to advance understanding of soil pollution. Understanding how plastic residuals change as they are left in the soil provides a critical piece of information for better estimating the microplastics' impacts on environmental biodiversity, ecosystem sustainability, and food safety.

Keywords: soil health, plastic pollution, sustainability, photodegradation

Procedia PDF Downloads 191
55 Effect of Volcanic Ash and Recycled Aggregates in Concrete

Authors: Viviana Letelier, Ester Tarela, Giacomo Moriconi

Abstract:

The cement industry is responsible for around 5% of CO2 emissions worldwide, and considering that concrete is one of the most used materials in construction, its total effect is important. An alternative to reduce the environmental impact of concrete production is to incorporate a certain amount of residuals in the mix, limiting the replacement percentages to avoid significant losses in the mechanical properties of the final material. This study analyses the variation in the mechanical properties of structural concretes with recycled aggregates and volcanic ash as cement replacement, to test the effect of the simultaneous use of different residuals in the same material. The analysed concretes are dosed for a compressive strength of 30 MPa. The recycled aggregates are obtained from prefabricated pipe debris with a compressive strength of 20 MPa. The volcanic ash was obtained from the Ensenada (Chile) area after the Calbuco eruption in April 2015. The percentages of natural coarse aggregates replaced by recycled aggregates are 0% and 30%, and the percentages of cement replaced by volcanic ash are 0%, 5%, 10% and 15%. The combined effect of both residuals on the mechanical properties of the concrete is evaluated through compressive strength tests, flexural strength tests, and the elasticity modulus, all after 28 curing days. Results show that increasing the amount of volcanic ash increases the losses in compressive strength. However, the use of up to 5% volcanic ash allows obtaining concretes with a compressive strength similar to the control concrete, whether recycled aggregates are used or not. Furthermore, the pozzolanic reaction that occurs between the amorphous silica and the calcium hydroxide (Ca(OH)2) provokes an increase of 10% in compressive strength when 5% volcanic ash is combined with 30% recycled aggregates. Flexural strength does not show significant changes with either of the residues. On the other hand, decreases of between 14% and 25% in the elasticity modulus have been found. Concretes with up to 30% recycled aggregates and 5% volcanic ash as cement replacement can thus be produced without significant losses in their mechanical properties, considerably reducing the environmental impact of the final material.

Keywords: compressive strength of recycled concrete, mechanical properties of recycled concrete, recycled aggregates, volcanic ash as cement replacement

Procedia PDF Downloads 277
54 ARIMA-GARCH, A Statistical Modeling for Epileptic Seizure Prediction

Authors: Salman Mohamadi, Seyed Mohammad Ali Tayaranian Hosseini, Hamidreza Amindavar

Abstract:

In this paper, we provide a procedure to analyze and model the EEG (electroencephalogram) signal as a time series using ARIMA-GARCH to predict an epileptic attack. The heteroskedasticity of the EEG signal is examined through the ARCH or GARCH (autoregressive conditional heteroskedasticity, generalized autoregressive conditional heteroskedasticity) test. The best ARIMA-GARCH model in the AIC sense is utilized to measure the volatility of the EEG from epileptic canine subjects and to forecast the future values of the EEG. An ARIMA-only model can perform prediction, but an ARCH or GARCH model acting on the residuals of the ARIMA attains a considerably improved forecast horizon. First, we estimate the best ARIMA model; then different orders of ARCH and GARCH models are surveyed to determine the best heteroskedastic model of the residuals of the selected ARIMA. Using the simulated conditional variance of the selected ARCH or GARCH model, we suggest a procedure to predict oncoming seizures. The results indicate that GARCH modeling captures the dynamic changes of variance well before the onset of a seizure. It can be inferred that the prediction capability comes from the ability of the combined ARIMA-GARCH modeling to cover the heteroskedastic nature of EEG signal changes.
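
A minimal sketch of the two-stage workflow described above, using statsmodels for the ARIMA mean model and the arch package for a GARCH(1,1) fit to its residuals, is shown below; the signal is synthetic, and the model orders are fixed rather than selected by AIC as in the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(0)

# Synthetic stand-in for an EEG segment: AR(2) signal whose innovation variance
# increases near the end, mimicking a pre-seizure volatility change
n = 1000
sigma = np.where(np.arange(n) < 800, 1.0, 3.0)
eps = rng.normal(0, sigma)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + eps[t]

# Step 1: fit the conditional mean with ARIMA (orders fixed here for brevity)
arima_fit = ARIMA(y, order=(2, 0, 0)).fit()
resid = arima_fit.resid

# Step 2: fit GARCH(1,1) to the ARIMA residuals to track the conditional variance
garch_fit = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
cond_vol = np.asarray(garch_fit.conditional_volatility)

# A rise in conditional volatility after t = 800 is the kind of change used as a warning
print("mean cond. volatility before/after:", cond_vol[:800].mean().round(2),
      cond_vol[800:].mean().round(2))
```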

Keywords: epileptic seizure prediction, ARIMA, ARCH and GARCH modeling, heteroskedasticity, EEG

Procedia PDF Downloads 374
53 Engineering Study on the Handling of Date Palm Fronds to Reduce Waste and Used as Energy Environmentally Friendly Fuel

Authors: Ayman H. Amer Eissa, Abdul Rahman O. Alghannam

Abstract:

Agricultural crop residuals are considered one of the most important problems faced by the environment and by farmers worldwide. A study was carried out to evaluate the physical characteristics of chopped date palm stalks (fronds and leaflets). These properties are necessary to apply normal design procedures such as pneumatic conveying, fluidization, drying, and combustion. Mechanical treatment by cutting, crushing or chopping, followed by briquetting, is the primary step and a suitable solution for solving this problem and recycling these residuals into useful products. The aim of the present work is therefore to obtain high-quality briquettes from agricultural residues such as date palm stalks (fronds) and date palm leaflets. The average shear and compressive strengths measured for date palm stalks at different moisture contents (12.63, 33.21 and 60.54%) were 6.4, 4.7 and 3.21 MPa and 3.8, 3.18 and 2.86 MPa, respectively. The modulus of elasticity and toughness were evaluated as a function of moisture content; as the moisture content of the stalk regions increased, the modulus of elasticity and toughness decreased, indicating a reduction in the brittleness of the stalk regions. Chopped date palm stalks (fronds) and date palm leaflets with moisture contents of 8, 10 and 12% and 8, 10 and 12.8% w.b. were densified into briquettes without binder and with binder (urea-formaldehyde) using a screw press machine. The quality properties evaluated for the briquettes were durability, hardness, bulk density, compression ratio, resiliency, water resistance and gas emissions. The optimum quality properties were found for briquettes at 8% moisture content without binder, where the highest compression stress and durability were 8.95 and 10.39 MPa and 97.06% and 93.64% for date palm stalk (frond) and date palm leaflet briquettes, respectively. The CO and CO2 emissions for date palm stalk (frond) and date palm leaflet briquettes were lower than those for the loose residuals.

Keywords: residues, date palm stalks, chopper, briquetting, quality properties

Procedia PDF Downloads 503
52 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis

Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera

Abstract:

Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry data provide detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, a log-linear bathymetric inversion model and a non-linear bathymetric inversion model, are applied for deriving bathymetry from high-resolution multispectral satellite imagery. This study compares these two approaches by means of geographical error analysis for the Kankesanturai site using WorldView-2 satellite imagery. The parameters of the non-linear inversion model were calibrated using the Levenberg-Marquardt method, and multiple linear regression was applied to calibrate the log-linear inversion model. To calibrate both models, Single Beam Echo Sounding (SBES) data in the study area were used as reference points. Residuals were calculated as the difference between the derived depth values and the validation echo-sounder bathymetry data, and the geographical distribution of the model residuals was mapped. The spatial autocorrelation was calculated, the performance of the bathymetric models was compared, and the geographic errors for both models were mapped. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model is used to generate more reliable estimates of bathymetry by quantifying the autocorrelation of the model error and incorporating this into an improved regression model. The log-linear model (R² = 0.846) performs better than the non-linear model (R² = 0.692). Finally, the spatial error models improved the bathymetric estimates derived from the linear and non-linear models up to R² = 0.854 and R² = 0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges. The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. Overall RMSE for the log-linear and non-linear inversion models was ±1.532 m and ±2.089 m, respectively.
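
A minimal sketch of the log-linear inversion step is given below: known depths are regressed on log-transformed, deep-water-corrected band radiances, and residuals and RMSE are computed; the two-band data, deep-water radiances, and attenuation values are synthetic placeholders, not WorldView-2 or SBES measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Synthetic stand-in for SBES calibration points: true depth and two band radiances
n = 200
depth = rng.uniform(1, 25, n)                     # metres
L_deep = np.array([120.0, 90.0])                  # deep-water radiance per band (illustrative)
k = np.array([0.10, 0.06])                        # effective attenuation per band
bands = L_deep + np.array([800.0, 500.0]) * np.exp(-2 * k * depth[:, None])
bands += rng.normal(0, 2.0, bands.shape)          # sensor noise

# Log-linear inversion: depth = a0 + a1*ln(B1 - Ldeep1) + a2*ln(B2 - Ldeep2)
X = np.log(np.clip(bands - L_deep, 1e-6, None))
model = LinearRegression().fit(X, depth)
residuals = depth - model.predict(X)

rmse = np.sqrt(np.mean(residuals ** 2))
print("R^2:", model.score(X, depth).round(3), " RMSE (m):", rmse.round(2))
```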

Keywords: log-linear model, multi spectral, residuals, spatial error model

Procedia PDF Downloads 272
51 The Evaluation of Gravity Anomalies Based on Global Models by Land Gravity Data

Authors: M. Yilmaz, I. Yilmaz, M. Uysal

Abstract:

The Earth system generates different phenomena that are observable at the surface of the Earth, such as mass deformations and displacements leading to plate tectonics, earthquakes, and volcanism. The dynamic processes associated with the interior, surface, and atmosphere of the Earth affect the three pillars of geodesy: the shape of the Earth, its gravity field, and its rotation. Geodesy establishes a characteristic structure in order to define, monitor, and predict the whole Earth system. The traditional and new instruments, observables, and techniques in geodesy are related to the gravity field. Therefore, geodesy monitors the gravity field and its temporal variability in order to transform the geodetic observations made on the physical surface of the Earth onto the geometrical surface on which positions are mathematically defined. In this paper, the main components of gravity field modeling, the free-air and Bouguer gravity anomalies, are calculated via recent global models (EGM2008, EIGEN6C4, and GECO) over a selected study area. The model-based gravity anomalies are compared with the corresponding terrestrial gravity data in terms of standard deviation (SD) and root mean square error (RMSE) in order to determine the best-fitting global model for the study area at a regional scale in Turkey. The lowest SD (13.63 mGal) and RMSE (15.71 mGal) were obtained by EGM2008 for the free-air gravity anomaly residuals. For the Bouguer gravity anomaly residuals, EIGEN6C4 provides the lowest SD (8.05 mGal) and RMSE (8.12 mGal). The results indicate that EIGEN6C4 can be a useful tool for modeling the gravity field of the Earth over the study area.

Keywords: free-air gravity anomaly, Bouguer gravity anomaly, global model, land gravity

Procedia PDF Downloads 137
50 Combining ASTER Thermal Data and Spatial-Based Insolation Model for Identification of Geothermal Active Areas

Authors: Khalid Hussein, Waleed Abdalati, Pakorn Petchprayoon, Khaula Alkaabi

Abstract:

In this study, we integrated ASTER thermal data with an area-based spatial insolation model to identify and delineate geothermally active areas in Yellowstone National Park (YNP). Two pairs of L1B ASTER day- and nighttime scenes were used to calculate land surface temperature. We employed the Emissivity Normalization Algorithm, which separates temperature from emissivity, to calculate surface temperature. We calculated the incoming solar radiation for the area covered by each of the four ASTER scenes using an insolation model and used this information to compute the temperature due to solar radiation. We then identified the statistical thermal anomalies using land surface temperature and the residuals calculated between the modeled temperatures and the ASTER-derived surface temperatures. Areas with temperatures or temperature residuals greater than 2σ, and those between 1σ and 2σ, were considered ASTER-modeled thermal anomalies. The areas identified as thermal anomalies were in strong agreement with the thermal areas obtained from the YNP GIS database, and the YNP hot springs and geysers were located within areas identified as anomalous thermal areas. The consistency between our results and known geothermally active areas indicates that thermal remote sensing data, integrated with a spatial-based insolation model, provide an effective means for identifying and locating areas of geothermal activity over large areas and rough terrain.
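
The statistical thresholding step can be summarized in a few lines: compute the residual between observed and insolation-modelled temperature and flag pixels above 1σ and 2σ. The raster below is synthetic, with a hypothetical warm patch standing in for a geothermal feature.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins for one scene: modelled (insolation-driven) and observed LST, in kelvin
shape = (200, 200)
t_model = 285.0 + rng.normal(0, 1.5, shape)
t_obs = t_model + rng.normal(0, 0.5, shape)
t_obs[60:70, 80:95] += 12.0                 # hypothetical geothermal patch

# Residuals between observed and modelled temperature, then statistical thresholding
resid = t_obs - t_model
mu, sigma = resid.mean(), resid.std()
strong = resid > mu + 2 * sigma             # anomalies above 2 sigma
weak = (resid > mu + sigma) & ~strong       # anomalies between 1 and 2 sigma

print("pixels > 2 sigma:", int(strong.sum()), "| pixels 1-2 sigma:", int(weak.sum()))
```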

Keywords: thermal remote sensing, insolation model, land surface temperature, geothermal anomalies

Procedia PDF Downloads 333
49 Waste Analysis and Classification Study (WACS) in Ecotourism Sites of Samal Island, Philippines Towards a Circular Economy Perspective

Authors: Reeden Bicomong

Abstract:

Ecotourism activities, though geared towards conservation efforts, still put pressure on the natural state of the environment. An influx of visitors beyond the carrying capacity of the ecotourism site, the wastes generated, and greenhouse gas emissions are just a few of the potential negative impacts of poorly managed ecotourism activities. According to Girard and Nocca (2017), tourism produces many negative impacts because it is configured according to the model of the linear economy, operating on a linear model of take, make and dispose (Ellen MacArthur Foundation 2015). With the influx of tourists in an ecotourism area, more wastes are generated, and if unregulated, the natural state of the environment will be at risk. It is in this light that a waste analysis and classification study (WACS) in five different ecotourism sites of Samal Island, Philippines was conducted. The major objective of the study was to analyze the amount and content of wastes generated from ecotourism sites in Samal Island, Philippines and to make recommendations from a circular economy perspective. Five ecotourism sites in Samal Island, Philippines were identified: Hagimit Falls, Sanipaan Vanishing Shoal, Taklobo Giant Clams, Monfort Bat Cave, and Tagbaobo Community Based Ecotourism. Ocular inspection of each ecotourism site was conducted, and key informant interviews of ecotourism operators and staff were carried out. Wastes generated from these ecotourism sites were analyzed and characterized to come up with recommendations based on the concept of the circular economy. Wastes generated were classified into biodegradables, recyclables, residuals, and special wastes. Regression analysis was conducted to determine whether an increase in the number of visitors equates to an increase in the amount of wastes generated. Ocular inspection indicated that all five ecotourism sites have their own system of waste collection. All of the sites inspected were found to be conducting waste separation at source, since there are different types of garbage bins for all four waste classifications (biodegradables, recyclables, residuals, and special wastes). Furthermore, all five ecotourism sites practice composting of biodegradable wastes and recycling of recyclables; therefore, only residuals are being collected by the municipal waste collectors. Key informant interviews revealed that all five ecotourism sites offer mostly nature-based activities such as swimming, diving, sightseeing, bat watching, rice farming experiences, and community living. Among the five ecotourism sites, Sanipaan Vanishing Shoal has the highest average number of visitors on a weekly basis and, in the waste assessment study conducted, also generated the highest amount of wastes. Further results of the waste analysis revealed that biodegradables constitute the majority of the wastes generated in all five selected ecotourism sites, while special wastes proved to be the least generated, as none of this type was observed during the three consecutive weeks in which WACS was conducted.

Keywords: circular economy, ecotourism, sustainable development, WACS

Procedia PDF Downloads 175
48 Valorization of Argan Residuals for the Treatment of Industrial Effluents

Authors: Salim Ahmed

Abstract:

The aim of this study was to valorize a natural residue in the form of activated carbon prepared from Moroccan plant waste (argan pits and date pits). After preparing the raw material, the carbon was carbonised at 300°C and chemically activated with phosphoric acid of 85% purity. The various characterisation results (moisture and ash content, specific surface area, pore volume, etc.) showed that the carbons obtained are comparable to those manufactured industrially and could therefore be tested, for example, in water treatment processes, and especially for the depollution of effluents from the agri-food and textile industries.

Keywords: activated carbon, water treatment, adsorption, argan

Procedia PDF Downloads 34
47 Spectral Analysis Applied to Variables of Oil Wells Profiling

Authors: Suzana Leitão Russo, Mayara Laysa de Oliveira Silva, José Augusto Andrade Filho, Vitor Hugo Simon

Abstract:

Seismic and prospecting methods are currently widely applied in the oil industry and, as is reported every day, oil is a non-renewable source of energy. It is therefore easy to understand why ownership of oil-extraction areas is coveted by many nations, and it is necessary to think about ways of maximizing oil production. The technique of spectral analysis can be used to analyze the behavior of the variables already defined in the oil well profile. The main objective is to verify the serial dependence of the variables and to model the variables in the frequency domain in order to observe the model residuals.

Keywords: oil, well, spectral analysis, oil extraction

Procedia PDF Downloads 499
46 Logistic Regression Model versus Additive Model for Recurrent Event Data

Authors: Entisar A. Elgmati

Abstract:

Recurrent infant diarrhea is studied using daily data collected in Salvador, Brazil over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates that were used in the analysis with the additive model. The logistic model gives reasonably similar results to those obtained with the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Also, martingale residuals, which have been used to judge the goodness of fit of the additive model, are shown to be useful for judging the goodness of fit of the logistic model.

Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event

Procedia PDF Downloads 603
45 Medial Axis Analysis of Valles Marineris

Authors: Dan James

Abstract:

The medial axis of the Main Canyon of Valles Marineris is determined geometrically with maximally inscribed discs aligned with the boundaries, or rims, of the Main Canyon. Inscribed discs are placed at evenly spaced longitude intervals and, using the radius function, the locus of the centres of all discs is determined, together with the disc centre co-ordinates. These centre co-ordinates form arrays of x, y co-ordinates which are fitted to a sinusoidal function, and residuals appropriate for nonlinear regression are evaluated using the R-squared value (R²) and the Root Mean Square Error (RMSE). This evaluation demonstrates that a sinusoidal curve closely fits the co-ordinate data.
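
A minimal sketch of the curve-fitting step, using scipy's nonlinear least squares to fit a sinusoid to centre co-ordinates and to report R² and RMSE from the residuals, is shown below; the co-ordinates are synthetic, not the Valles Marineris medial-axis data.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Synthetic stand-in for disc-centre co-ordinates: x = longitude, y = centre latitude
x = np.linspace(0, 60, 120)
y = 2.0 * np.sin(2 * np.pi * x / 40.0 + 0.6) - 8.0 + rng.normal(0, 0.15, 120)

def sinusoid(x, amp, period, phase, offset):
    return amp * np.sin(2 * np.pi * x / period + phase) + offset

# Initial guesses roughly near the expected scale of the curve
params, _ = curve_fit(sinusoid, x, y, p0=[1.5, 45.0, 0.0, np.mean(y)])
residuals = y - sinusoid(x, *params)

rmse = np.sqrt(np.mean(residuals ** 2))
r_squared = 1.0 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)
print("params:", np.round(params, 3), " R^2:", round(r_squared, 4), " RMSE:", round(rmse, 3))
```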

Keywords: medial axis, MAT, valles marineris, sinusoidal

Procedia PDF Downloads 68
44 Behind Fuzzy Regression Approach: An Exploration Study

Authors: Lavinia B. Dulla

Abstract:

This exploration study of the fuzzy regression approach attempts to show that fuzzy regression can be used as a possible alternative to classical regression. It likewise seeks to assess the differences and characteristics of simple linear regression and fuzzy regression using the width of the prediction interval, mean absolute deviation, and variance of residuals. Based on the simple linear regression model, the fuzzy regression approach is worth considering as an alternative to simple linear regression when the sample size is between 10 and 20. As the sample size increases, the fuzzy regression approach is no longer applicable, since the large-sample assumption is already operating within the framework of simple linear regression. Nonetheless, it can be suggested as a practical alternative when decisions often have to be made on the basis of small data.

Keywords: fuzzy regression approach, minimum fuzziness criterion, interval regression, prediction interval

Procedia PDF Downloads 254
43 Robust Diagnosis Efficiency by Bond-Graph Approach

Authors: Benazzouz Djamel, Termeche Adel, Touati Youcef, Alem Said, Ouziala Mahdi

Abstract:

This paper presents an approach which efficiently detects and isolates a fault in a system while avoiding false alarms, non-detections and delays in fault detection. A case study is proposed to show the importance of taking uncertainties into consideration in the decision-making procedure, their effect on the degradation of diagnostic performance, and the advantage of using the Bond Graph (BG) for handling such degradation. The use of the BG in the Linear Fractional Transformation (LFT) form allows generating robust Analytical Redundancy Relations (ARRs), where the uncertain part of the ARRs is used to generate the residuals' adaptive thresholds. The case study concerns an electromechanical system composed of a motor, a reducer and an external load. The aim of this application is to show the effectiveness of the BG-LFT approach for robust fault detection.

Keywords: bond graph, LFT, uncertainties, detection and faults isolation, ARR

Procedia PDF Downloads 277
42 Least Squares Method Identification of Corona Current-Voltage Characteristics and Electromagnetic Field in Electrostatic Precipitator

Authors: H. Nouri, I. E. Achouri, A. Grimes, H. Ait Said, M. Aissou, Y. Zebboudj

Abstract:

This paper aims to analyse the behaviour of the DC corona discharge in wire-to-plate electrostatic precipitators (ESP). Current-voltage curves are particularly analysed. Experimental results show that the discharge current is strongly affected by the applied voltage. The proposed method of current identification is the method of least squares. Least squares problems fall into two categories, linear (or ordinary) least squares and non-linear least squares, depending on whether or not the residuals are linear in all unknowns. The linear least-squares problem occurs in statistical regression analysis; it has a closed-form solution. A closed-form solution (or closed-form expression) is any formula that can be evaluated in a finite number of standard operations. The non-linear problem has no closed-form solution and is usually solved iteratively.
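
A minimal sketch of the non-linear case is given below, fitting the commonly used quadratic corona law I = K*V*(V - V0) to current-voltage points with scipy's iterative least squares; the data points are illustrative, not measurements from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative corona current-voltage points (kV, uA); not measured data from the paper
V = np.array([10, 12, 14, 16, 18, 20, 22, 24], dtype=float)
I = np.array([2.0, 7.2, 14.1, 22.3, 32.4, 44.2, 57.4, 71.8])

def corona_law(V, K, V0):
    """Commonly used quadratic corona law I = K * V * (V - V0), V0 = onset voltage."""
    return K * V * (V - V0)

# Non-linear least squares: solved iteratively from an initial guess
params, _ = curve_fit(corona_law, V, I, p0=[0.1, 8.0])
K_fit, V0_fit = params
residuals = I - corona_law(V, *params)

print(f"K = {K_fit:.4f} uA/kV^2, onset V0 = {V0_fit:.2f} kV, "
      f"RMS residual = {np.sqrt(np.mean(residuals ** 2)):.3f} uA")
```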

Keywords: electrostatic precipitator, current-voltage characteristics, least squares method, electric field, magnetic field

Procedia PDF Downloads 400
41 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar

Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo

Abstract:

The future of solar energy in Qatar is evolving steadily. Hence, high-quality spatial solar radiation data is of the utmost importance for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data are developed using physical and statistical models, while ground data are collected by solar radiation measurement stations. The ground data are of high quality; however, they are limited to distributed point locations, with a high cost of installation and maintenance for the ground stations. On the other hand, satellite solar radiation data are continuous and available throughout geographical locations, but they are relatively less accurate than ground data. To exploit the advantages of both, a product has been developed here which provides spatial continuity and higher accuracy than either dataset alone. The popular satellite database NSRDB (National Solar Radiation Database; PSM V3 model, spatial resolution 4 km) is chosen here for merging with ground-measured solar radiation measurements in Qatar. The spatial distribution of ground solar radiation measurement stations is comprehensive in Qatar, with a network of 13 ground stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data is used for the error analysis. Normalized root mean square error (NRMSE) values of 3.31%, 6.53%, and 6.63% were observed for October, November, and December 2019, respectively, when comparing in-situ and NSRDB data. The method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS (ESRI). The workflow of the algorithm is based on a combination of regression and kriging methods. A regression model (OLS, ordinary least squares) is fitted between the ground and NSRDB data points. A semi-variogram model is fitted to the experimental semi-variogram obtained from the residuals. The kriged residuals obtained after fitting the semi-variogram model were added to the NSRDB predicted values obtained from the regression model to obtain the final predicted values. The NRMSE values obtained after merging are 1.84%, 1.28%, and 1.81% for October, November, and December 2019, respectively. One more explanatory variable, the ground elevation, has been incorporated in the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps have been created after merging, and NRMSE values of 1.24%, 1.28%, and 1.28% have been observed for October, November, and December 2019, respectively. The proposed merging method has proven to be highly accurate. An additional method is also proposed here to generate calibrated maps using the regression and kriging model, and further to use the calibrated model to generate solar radiation maps from the explanatory variables only, when not enough historical ground data are available for long-term analysis. The NRMSE values obtained after comparing the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019, respectively.
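
The regression-plus-residual-kriging workflow can be sketched as below, with ordinary least squares for the trend and a Gaussian-process interpolator standing in for the semi-variogram/kriging of the residuals; the station coordinates, GHI values, and elevations are synthetic, and this is not ESRI's Empirical Bayesian Kriging implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(11)

# Synthetic stand-in for 13 ground stations: coordinates, elevation, satellite and ground GHI
n = 13
coords = rng.uniform(0, 1, (n, 2))
elev = rng.uniform(0, 100, n)
sat_ghi = rng.normal(6.0, 0.4, n)                       # kWh/m2/day, illustrative
ground_ghi = 0.92 * sat_ghi + 0.002 * elev + 0.3 + 0.05 * np.sin(6 * coords[:, 0])

# Step 1: OLS regression of ground GHI on satellite GHI and elevation
X_reg = np.column_stack([sat_ghi, elev])
ols = LinearRegression().fit(X_reg, ground_ghi)
resid = ground_ghi - ols.predict(X_reg)

# Step 2: interpolate the regression residuals in space (kriging-like step)
gp = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(1e-4), normalize_y=True)
gp.fit(coords, resid)

# Step 3: merged estimate at a new pixel = regression prediction + interpolated residual
new_coord = np.array([[0.5, 0.5]])
new_X_reg = np.array([[6.1, 40.0]])                     # satellite GHI and elevation there
ghi_merged = ols.predict(new_X_reg) + gp.predict(new_coord)
print("merged GHI estimate:", ghi_merged.round(3))
```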

Keywords: global horizontal irradiation, GIS, empirical bayesian kriging regression prediction, NSRDB

Procedia PDF Downloads 58
40 Fault Detection and Isolation of a Three-Tank System using Analytical Temporal Redundancy, Parity Space/Relation Based Residual Generation

Authors: A. T. Kuda, J. J. Dayya, A. Jimoh

Abstract:

This paper investigates a fault detection and isolation technique for measurement data sets from a three-tank system using analytical model-based temporal redundancy, in which residuals are generated using the parity equations/space approach. It also briefly outlines other approaches to model-based residual generation. The basic idea of parity-space residual generation in temporal redundancy is the dynamic relationship between sensor outputs and actuator inputs (the input-output model). These residuals were then used to detect whether or not the system is faulty and to indicate the location of the fault when it is faulty. The method obtains good results by detecting and isolating faults from the considered data sets of measurements generated from the system.

Keywords: fault detection, fault isolation, disturbing influences, system failure, parity equation/relation, structured parity equations

Procedia PDF Downloads 269
39 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs

Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya

Abstract:

Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval, as well as on properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.

Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs

Procedia PDF Downloads 223
38 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network

Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem

Abstract:

The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series, adopting a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Secondly, the residuals from the k-factor GARMA model are used as a proxy for the conditional variance, and these residuals are predicted using two different approaches. In the first approach, a local linear wavelet neural network (LLWNN) model is developed to predict the conditional variance using back-propagation learning algorithms. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity (G-GARCH) process is adopted, and the parameters of the k-factor GARMA-G-GARCH model are estimated using a wavelet methodology based on the discrete wavelet packet transform (DWPT). The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.

Keywords: electricity price, k-factor GARMA, LLWNN, G-GARCH, forecasting

Procedia PDF Downloads 201
37 Fluid Structure Interaction of Flow and Heat Transfer around a Microcantilever

Authors: Khalil Khanafer

Abstract:

This study analyzes the effect of flow conditions and of the geometric variation of the microcantilever's bluff body on the microcantilever detection capabilities within a fluidic device, using a finite element fluid-structure interaction model. The parameters considered include inlet velocity, flow direction, and the height of the microcantilever's supporting system within the fluidic cell. The transport equations are solved using a finite element formulation based on the Galerkin method of weighted residuals. For a flexible microcantilever, a fully coupled fluid-structure interaction (FSI) analysis is utilized, and the fluid domain is described by an Arbitrary Lagrangian-Eulerian (ALE) formulation that is fully coupled to the structure domain. The results of this study show a profound effect of the magnitude and direction of the inlet velocity and of the height of the bluff body on the deflection of the microcantilever. The vibration characteristics were also investigated. This work paves the way for researchers to design efficient microcantilevers that exhibit the least error in measurements.
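
For readers unfamiliar with the Galerkin method of weighted residuals, the sketch below applies it to a 1D model problem (linear finite elements for -u'' = f with homogeneous Dirichlet conditions); it is only a conceptual illustration, not the coupled FSI/ALE solver used in the study.

```python
import numpy as np

# Galerkin weighted residuals with linear finite elements for -u'' = f on [0, 1],
# u(0) = u(1) = 0; f is chosen so that the exact solution is sin(pi*x)
f = lambda x: np.pi ** 2 * np.sin(np.pi * x)
n_el = 20
x = np.linspace(0.0, 1.0, n_el + 1)
h = x[1] - x[0]

K = np.zeros((n_el + 1, n_el + 1))      # global stiffness matrix
F = np.zeros(n_el + 1)                  # global load vector
for e in range(n_el):
    # Element stiffness from the integral of phi_i' * phi_j'; load by midpoint quadrature
    K[e:e + 2, e:e + 2] += np.array([[1.0, -1.0], [-1.0, 1.0]]) / h
    F[e:e + 2] += f(0.5 * (x[e] + x[e + 1])) * h / 2.0

# Impose the homogeneous Dirichlet boundary conditions and solve the interior system
u = np.zeros(n_el + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

print("max error vs. exact solution:", float(np.abs(u - np.sin(np.pi * x)).max()))
```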

Keywords: fluidic cell, FSI, microcantilever, flow direction

Procedia PDF Downloads 349