Search results for: key frame extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2930

2150 Recovery of Copper and Gold by Delamination of Printed Circuit Boards Followed by Leaching and Solvent Extraction Process

Authors: Kamalesh Kumar Singh

Abstract:

Due to increasing trends of electronic waste, specially the ICT related gadgets, their green recycling is still a greater challenge. This article presents a two-stage, eco-friendly hydrometallurgical route for the recovery of gold from the delaminated metallic layers of waste mobile phone Printed Circuit Boards (PCBs). Initially, mobile phone PCBs are downsized (1x1 cm²) and treated with an organic solvent dimethylacetamide (DMA) for the separation of metallic fraction from non-metallic glass fiber. In the first stage, liberated metallic sheets are used for the selective dissolution of copper in an aqueous leaching reagent. Influence of various parameters such as type of leaching reagent, the concentration of the solution, temperature, time and pulp density are optimized for the effective leaching (almost 100%) of copper. Results have shown that 3M nitric acid is a suitable reagent for copper leaching at room temperature and considering chemical features, gold remained in solid residue. In the second stage, the separated residue is used for the recovery of gold by using sulphuric acid with a combination of halide salt. In this halide leaching, Cl₂ or Br₂ is generated as an in-situ oxidant to improve the leaching of gold. Results have shown that almost 92 % of gold is recovered at the optimized parameters.

Keywords: printed circuit boards, delamination, leaching, solvent extraction, recovery

Procedia PDF Downloads 57
2149 Uplift Segmentation Approach for Targeting Customers in a Churn Prediction Model

Authors: Shivahari Revathi Venkateswaran

Abstract:

Segmenting customers plays a significant role in churn prediction, helping the marketing team with both proactive and reactive customer retention. For reactive retention, the retention team reaches out to customers who have already shown intent to disconnect and offers them special deals. For proactive retention, the marketing team uses a churn prediction model that ranks each customer from 1 to 100, with rank 1 indicating the highest risk of churning/disconnecting. The churn prediction model is built using an XGBoost model. With churn ranks alone, however, the marketing team can only reach out to customers based on their individual ranks; profiling different groups of customers and framing distinct marketing strategies for targeted groups is not possible. For this, customers must be grouped into segments based on their profiles, such as demographics and other non-controllable attributes, which lets the marketing team frame different offer groups for the targeted audience and prevent them from disconnecting (proactive retention). For segmentation, machine learning approaches like k-means clustering do not form unique customer segments whose members share the same attributes. This paper presents an alternate approach that enumerates all unique segments that can be formed from the user attributes and then identifies the segments with uplift (a churn rate higher than the baseline churn rate), using search algorithms such as fast search and recursive search. Within each segment, customers can then be targeted using their individual churn ranks from the churn prediction model. Finally, a UI (User Interface) was developed that lets the marketing team interactively search the resulting segments and target the right audience for future marketing campaigns to prevent disconnections.
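As an illustration of the uplift-segment search described above, the following sketch scans every attribute-value combination and keeps segments whose churn rate exceeds the baseline. Nothing here is from the paper: the attribute names, the toy data, and the brute-force enumeration are all illustrative (the paper's fast and recursive searches prune this space):

```python
from itertools import combinations, product

# Toy customer records: non-controllable attributes plus a churn flag.
customers = [
    {"region": "east", "tenure": "short", "churned": 1},
    {"region": "east", "tenure": "long",  "churned": 0},
    {"region": "west", "tenure": "short", "churned": 1},
    {"region": "west", "tenure": "long",  "churned": 0},
    {"region": "east", "tenure": "short", "churned": 1},
    {"region": "west", "tenure": "long",  "churned": 0},
]

attributes = ["region", "tenure"]
baseline = sum(c["churned"] for c in customers) / len(customers)

def churn_rate(segment):
    """Churn rate and size of the group matching every attribute=value pair."""
    members = [c for c in customers
               if all(c[a] == v for a, v in segment.items())]
    if not members:
        return None, 0
    return sum(c["churned"] for c in members) / len(members), len(members)

# Exhaustively enumerate attribute-value combinations and keep uplift segments.
uplift_segments = []
for r in range(1, len(attributes) + 1):
    for attrs in combinations(attributes, r):
        values = [sorted({c[a] for c in customers}) for a in attrs]
        for combo in product(*values):
            segment = dict(zip(attrs, combo))
            rate, size = churn_rate(segment)
            if rate is not None and rate > baseline:
                uplift_segments.append((segment, rate, size))
```

Each surviving segment can then be cross-referenced with the individual churn ranks for targeting.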

Keywords: churn prediction modeling, XGBoost model, uplift segments, proactive marketing, search algorithms, retention, k-means clustering

Procedia PDF Downloads 71
2148 Dynamic Test for Sway-Mode Buckling of Columns

Authors: Boris Blostotsky, Elia Efraim

Abstract:

Testing of columns in sway mode is performed in order to determine the maximal allowable load, limited by plastic deformation of the columns or their end connections, and the critical load, limited by column stability. The motivation for determining an accurate value of the critical force is as follows: the critical load is the maximal allowable load for a given column configuration and can be used as a criterion of perfection; it is used in calculations prescribed by standards for the design of structural elements under combined compression and bending; and it is used for verification of theoretical stability analyses at various column end conditions. In the present work, a new non-destructive method for determining the critical buckling load of columns in sway mode is proposed. The method allows measurements to be performed under loads that exceed the column's critical load without loss of stability. Such loading is made possible by the structure of the loading system: a frame with a rigid girder, in which one column is the tested column and the other is an additional two-hinged strut. The frame is loaded by a flexible traction element attached to the girder; by choosing the parameters of the traction element and the additional strut, the load applied to the tested column can reach values that exceed the critical load. The lateral stiffness of the system and the critical load of the column are obtained by the dynamic method. Experiment planning and the comparison between experimental and theoretical values were based on the developed dependency of the system's lateral stiffness on vertical load, taking into account the semi-rigid connections of the column's ends. Agreement between the obtained results was established. The method can be used for testing real full-size columns under industrial conditions.
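A minimal numerical sketch of the dynamic method's final step, assuming hypothetical load-stiffness measurements and an idealized linear drop of lateral stiffness with vertical load (neither the data nor the single-degree-of-freedom idealization is from the paper):

```python
import numpy as np

# Hypothetical test data: vertical load P (kN) vs. lateral stiffness K (kN/m)
# of the frame, as would be obtained from measured natural frequencies
# (K = m * omega^2 for an idealized single-degree-of-freedom girder).
P = np.array([0.0, 20.0, 40.0, 60.0])    # applied vertical loads
K = np.array([100.0, 75.0, 50.0, 25.0])  # lateral stiffness readings

# For sway buckling, K falls approximately linearly with P, so the
# critical load is the extrapolated zero-stiffness intercept.
slope, intercept = np.polyfit(P, K, 1)
P_cr = -intercept / slope
```

With these toy readings the fit extrapolates to a critical load of 80 kN; real tests would use measured frequencies and account for the semi-rigid end connections.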

Keywords: buckling, columns, dynamic method, semi-rigid connections, sway mode

Procedia PDF Downloads 313
2147 Determination of the Axial-Vector from an Extended Linear Sigma Model

Authors: Tarek Sayed Taha Ali

Abstract:

The dependence of the axial-vector coupling constant gA on the quark masses has been investigated in the framework of the extended linear sigma model. The field equations have been solved in the mean-field approximation. Our study shows a better fit to the experimental data compared with the existing models.

Keywords: extended linear sigma model, nucleon properties, axial coupling constant, physics

Procedia PDF Downloads 446
2146 Feasibility of Chicken Feather Waste as a Renewable Resource for Textile Dyeing Processes

Authors: Belayihun Missaw

Abstract:

Cotton cationization is an emerging area that addresses the environmental problems associated with the reactive dyeing of cotton. In this study, a keratin hydrolysate cationizing agent was extracted from chicken feathers and optimized to eliminate the use of salt during dyeing. Cotton was cationized with the extracted keratin hydrolysate and then dyed without salt. The effects of extraction conditions, namely caustic soda concentration, temperature, and time, on the protein yield from chicken feathers and on colour strength (K/S) values were studied, and these process conditions were optimized. The optimum extraction conditions were 25 g/l caustic soda at 50 °C for 105 minutes, giving an average yield of 91.2% and a colour strength value of 4.32. The effects of salt addition, pH, and cationizing agent concentration on yield and colour strength were also studied and optimized. It was observed that a slightly acidic condition with a 4% (owf) concentration of cationizing agent gives better dyeability than normal cotton reactive dyeing. The physical properties of the cationized-dyed fabric were assessed, and the results reveal that cationization has an effect similar to normal dyeing of cotton. The cationization of cotton with keratin extract was found to be successful and economically viable.

Keywords: cotton materials, cationization, reactive dye, keratin hydrolysate

Procedia PDF Downloads 63
2145 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm

Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta

Abstract:

Robots play an important role in operations like pick and place, assembly, spot welding, and much more in manufacturing industries. Among these, assembly is a very important process, accounting for about 20% of total manufacturing cost. To perform the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem: achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to the ASP problem, but these have several limitations, such as convergence to local optima, a huge search space, long execution times, and complexity in applying the algorithm. Keeping these limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this method, assembly predicates are extracted automatically through a Computer Aided Design (CAD) interface instead of manually, which reduces the time needed to obtain feasible assembly sequences. The fitness of each obtained feasible sequence is then evaluated using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.
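The ABC algorithm itself can be sketched on a toy continuous objective. This is a generic sphere-function minimization, not the paper's assembly-sequence fitness, and all parameters (colony size, abandonment limit, iteration count) are illustrative:

```python
import random

def sphere(x):
    """Toy objective standing in for the assembly-sequence fitness."""
    return sum(v * v for v in x)

def abc_minimize(f, dim=2, bounds=(-5.0, 5.0), n_food=10, limit=20, iters=200, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    trials = [0] * n_food

    def neighbor(i):
        # Perturb one coordinate of food i towards/away from a random peer k.
        k = rng.randrange(n_food)
        while k == i:
            k = rng.randrange(n_food)
        j = rng.randrange(dim)
        x = list(foods[i])
        x[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        x[j] = min(max(x[j], lo), hi)
        return x

    def greedy(i):
        cand = neighbor(i)
        if f(cand) < f(foods[i]):
            foods[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                 # employed bee phase
            greedy(i)
        fits = [1.0 / (1.0 + f(x)) for x in foods]
        total = sum(fits)
        for _ in range(n_food):                 # onlooker bee phase (roulette)
            r, acc, chosen = rng.uniform(0, total), 0.0, n_food - 1
            for j, ft in enumerate(fits):
                acc += ft
                if acc >= r:
                    chosen = j
                    break
            greedy(chosen)
        for i in range(n_food):                 # scout bee phase
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                trials[i] = 0
    return min(foods, key=f)

best = abc_minimize(sphere)
```

For ASP, the continuous food sources would be replaced by feasible sequences derived from the CAD-extracted assembly predicates.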

Keywords: assembly sequence planning, CAD, artificial bee colony algorithm, assembly predicates

Procedia PDF Downloads 237
2144 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many respects. Automatic lane line extraction and modeling are the most essential steps in generating a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, a multi-region Otsu thresholding method is applied, which selects the intensity threshold of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high positional accuracy with an error of less than 10 cm.
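The single-region core of the Otsu step can be sketched as follows; the synthetic intensity sample and bin count are illustrative, and the paper's multi-region variant would apply this same search per spatial region:

```python
import numpy as np

def otsu_threshold(intensities, n_bins=64):
    """Return the intensity threshold maximizing the between-class
    variance (background vs. road markings) over a 1-D sample."""
    hist, edges = np.histogram(intensities, bins=n_bins)
    prob = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = centers[0], -1.0
    for k in range(1, n_bins):
        w0, w1 = prob[:k].sum(), prob[k:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (prob[:k] * centers[:k]).sum() / w0  # class means
        mu1 = (prob[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

# Synthetic laser intensities: dim asphalt background plus bright markings.
rng = np.random.default_rng(0)
background = rng.normal(30, 5, 5000)
markings = rng.normal(200, 10, 500)
t = otsu_threshold(np.concatenate([background, markings]))
```

On this synthetic sample the selected threshold lands between the two intensity populations, separating marking points from the asphalt.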

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 128
2143 (De)Motivating Mitigation Behavior: An Exploratory Framing Study Applied to Sustainable Food Consumption

Authors: Youval Aberman, Jason E. Plaks

Abstract:

This research provides initial evidence that the self-efficacy of mitigation behavior, the belief that one's actions can make a difference to the environment, can be implicitly inferred from the way numerical information is presented in environmental messages. The scientific community sees climate change as a pressing issue, but the general public tends to construe climate change as an abstract phenomenon that is psychologically distant. As such, a main barrier to pro-environmental behavior is that individuals often believe that their own behavior makes little to no difference to the environment. When it comes to communicating how the behavior of billions of individuals affects global climate change, it might appear valuable to aggregate those billions and present the shocking enormity of the resources individuals consume. This research provides initial evidence that, in fact, this strategy is ineffective: presenting large-scale aggregate data dilutes the contribution of the individual and impedes individuals' motivation to act pro-environmentally. The high-impact, underrepresented behavior of eating a sustainable diet was chosen for the present studies. U.S. participants (total N = 668) were recruited online for a study on 'meat and the environment' and received information about some of the resources used in meat production (water, CO2e, and feed), with numerical information that varied in its frame of reference. A 'Nation' frame of reference discussed the resources used by the beef industry, such as the billions of CO2e released daily by the industry, while a 'Meal' frame of reference presented the resources used in the production of a single beef dish. Participants completed measures of pro-environmental attitudes and behavioral intentions, either immediately (Study 1) or two days (Study 2) after reading the information. In Study 2 (n = 520), participants also indicated whether they consumed less or more meat than usual, and an additional control condition containing no environmental data was included. In Study 1, participants who read about meat production at a national level, compared to a meal level, reported lower motivation to make ecologically conscious dietary choices and lower behavioral intention to change their diet. In Study 2, a similar pattern emerged, with the added insight that the Nation condition, but not the Meal condition, deviated from the control condition. On average, participants across conditions reduced their meat consumption over the course of Study 2, except those in the Nation condition, whose consumption remained unchanged. Presenting the nation-wide consequences of human behavior is thus a double-edged sword: framing at a large scale might reveal the relationship between collective actions and environmental issues, but it hinders the belief that individual actions make a difference.

Keywords: climate change communication, environmental concern, meat consumption, motivation

Procedia PDF Downloads 158
2142 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of Vector Quantization (VQ) and Gaussian Mixture Model (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM before the EM step of parameter estimation also improved the convergence rate and the system's performance. The system further uses a relative index as a confidence measure in case of contradiction between the GMM and VQ identification results. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
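The VQ side of the speaker modeling can be sketched with a minimal LBG codebook on toy two-dimensional "MFCC" frames. This is illustrative only: real MFCCs are typically 13-dimensional, the Gaussian toy data stands in for actual utterances, and the codebook size is arbitrary:

```python
import numpy as np

def lbg_codebook(features, size=4, eps=0.01, iters=20):
    """Linde-Buzo-Gray: grow a VQ codebook by centroid splitting
    followed by Lloyd refinement."""
    codebook = features.mean(axis=0, keepdims=True)
    while len(codebook) < size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):  # Lloyd iterations
            d = np.linalg.norm(features[:, None] - codebook[None], axis=2)
            labels = d.argmin(axis=1)
            for k in range(len(codebook)):
                pts = features[labels == k]
                if len(pts):
                    codebook[k] = pts.mean(axis=0)
    return codebook

def distortion(features, codebook):
    """Mean distance of each frame to its nearest codeword."""
    d = np.linalg.norm(features[:, None] - codebook[None], axis=2)
    return d.min(axis=1).mean()

# Toy "MFCC" frames for two enrolled speakers.
rng = np.random.default_rng(1)
spk_a = rng.normal([0, 0], 0.5, (200, 2))
spk_b = rng.normal([3, 3], 0.5, (200, 2))
books = {"A": lbg_codebook(spk_a), "B": lbg_codebook(spk_b)}

# Identification: pick the enrolled speaker with lowest VQ distortion.
test_frames = rng.normal([3, 3], 0.5, (50, 2))   # unknown utterance
identified = min(books, key=lambda s: distortion(test_frames, books[s]))
```

In the paper's MCS, this VQ decision would be combined with a GMM likelihood score, with a relative index arbitrating disagreements.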

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 309
2141 Estimation of Forces Applied to Forearm Using EMG Signal Features to Control of Powered Human Arm Prostheses

Authors: Faruk Ortes, Derya Karabulut, Yunus Ziya Arslan

Abstract:

According to recent experimental research, myoelectric features gathered from the muscular environment are the preferred means of perceiving muscle activation and controlling human arm prostheses. EMG (electromyography) signal based human arm prostheses have in recent years shown promising performance in providing the basic functional motion requirements of amputated people. However, these assistive devices for neurorehabilitation still have important limitations in enabling amputated people to perform sophisticated or functional movements. The surface electromyogram (EMG) is used as the control signal to command such devices; this kind of control consists of activating a motion in the prosthetic arm using the muscle activation for that same motion. Extracting clear and reliable neural information from EMG signals plays a major role, especially in the fine control of hand prosthesis movements, and many signal processing methods have been utilized for feature extraction from EMG signals. The specific objective of this study was to compare widely used time-domain features of the EMG signal, namely integrated EMG (IEMG), root mean square (RMS), and waveform length (WL), for the prediction of forces externally applied to human hands. The obtained features were classified using artificial neural networks (ANN) to predict the forces. The processed EMG signals were recorded during isometric and isotonic muscle contractions. Experiments were performed by three healthy right-handed subjects aged 25-35 years. EMG signals were collected from muscles of the proximal part of the upper body: biceps brachii, triceps brachii, pectoralis major, and trapezius. The force prediction results obtained from the ANN were statistically analyzed, and the merits and pitfalls of the extracted features are discussed in detail. The obtained results are anticipated to contribute to the classification of EMG signals and to the motion control of powered human arm prostheses.
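The three time-domain features compared in the study have standard definitions, sketched here on a toy analysis window (the window values are illustrative):

```python
import numpy as np

def time_domain_features(emg):
    """IEMG, RMS, and waveform length for one EMG analysis window."""
    emg = np.asarray(emg, dtype=float)
    iemg = np.abs(emg).sum()          # integrated EMG: sum of rectified signal
    rms = np.sqrt(np.mean(emg ** 2))  # root mean square amplitude
    wl = np.abs(np.diff(emg)).sum()   # waveform length: cumulative variation
    return iemg, rms, wl

iemg, rms, wl = time_domain_features([0.0, 1.0, -1.0, 0.5])
```

In the study's pipeline, one such feature vector per window per muscle channel would feed the ANN force predictor.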

Keywords: assistive devices for neurorehabilitation, electromyography, feature extraction, force estimation, human arm prosthesis

Procedia PDF Downloads 367
2140 Time, Uncertainty, and Technological Innovation

Authors: Xavier Everaert

Abstract:

Ever since the publication of "The Problem of Social Cost", Coasean insights on externalities, transaction costs, and the reciprocal nature of harms have been widely debated. What has been largely neglected, however, is the role of technological innovation in the mitigation of negative externalities or transaction costs. Incorporating future uncertainty about negligence standards or expected restitution costs, and the profit opportunities these uncertainties reveal to entrepreneurs, allows us to frame problems regarding social costs within the reality of rapid technological evolution.

Keywords: environmental law and economics, entrepreneurship, commons, pollution, wildlife

Procedia PDF Downloads 421
2139 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field

Authors: Zerroug Abdelhamid, Danielle Chassoux

Abstract:

Various models are given to simulate homogeneous or heterogeneous cancerous tumors and, in each case, to extract the boundary. The fractal dimension is then estimated by the least squares method and compared with some previous methods.
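One common least-squares estimator of fractal dimension is box counting, sketched here on a smooth test boundary; the circle sample and box sizes are illustrative stand-ins for the tumor boundaries the abstract describes:

```python
import numpy as np

def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Estimate the fractal dimension of a point set in the unit square
    by a least-squares fit of log N(s) against log(1/s)."""
    points = np.asarray(points)
    counts = []
    for s in sizes:
        # Count the distinct s-sized boxes occupied by at least one point.
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)
    return slope

# Test boundary: points on a circle (a smooth curve, so D should be near 1).
theta = np.linspace(0, 2 * np.pi, 20000)
circle = 0.5 + 0.4 * np.column_stack([np.cos(theta), np.sin(theta)])
d = box_counting_dimension(circle)
```

An irregular tumor boundary would yield a slope between 1 and 2, with higher values indicating rougher contours.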

Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering

Procedia PDF Downloads 365
2138 To Study the Effect of Drying Temperature Towards Extraction of Aquilaria subintegra Dry Leaves Using Vacuum Far Infrared

Authors: Tengku Muhammad Rafi Nazmi Bin Tengku Razali, Habsah Alwi

Abstract:

This article examines the effect of drying temperature on the extraction of Aquilaria subintegra, whose main habitat is tropical Asia, particularly its native Thailand. It has been claimed that Aquilaria subintegra contains antipyretic properties that help fight fever, and research has also shown that paracetamol can have adverse effects on consumers. The samples were first dried using vacuum far infrared, which provides better drying than a conventional oven. A Soxhlet extractor was used to extract oil from the samples, and a gas chromatography mass spectrometer was used to analyze the extracts and determine their compounds. The objectives of this research were to determine the active ingredients present in Aquilaria subintegra leaves and whether the compound acetaminophen exists in them. Moisture content was 80% at 40 °C, 62% at 50 °C, and 36% at 60 °C: the higher the temperature, the lower the moisture content of the sample leaves. Seven components were identified in the sample dried at 40 °C, while only five were identified at 50 °C and 60 °C. Four components were common to all three samples: n-hexadecanoic acid; 9,12,15-octadecatrienoic acid, methyl ester (Z,Z,Z); vitamin E; and squalene. Further studies with a new series of temperatures are needed to refine the results.

Keywords: aquilaria subintegra, vacuum far infrared, Soxhlet extractor, gas chromatography mass spectrometer, paracetamol

Procedia PDF Downloads 484
2137 Modern Seismic Design Approach for Buildings with Hysteretic Dampers

Authors: Vanessa A. Segovia, Sonia E. Ruiz

Abstract:

The use of energy dissipation systems for seismic applications has increased worldwide, making it necessary to develop practical and modern criteria for their optimal design. Here, a direct displacement-based seismic design approach for frame buildings with hysteretic energy dissipation systems (HEDS) is applied. The building is constituted by two individual structural systems: 1) a main elastic structural frame designed for service loads, and 2) a secondary system, corresponding to the HEDS, that controls the effects of lateral loads. The procedure involves controlling two design parameters: a) the stiffness ratio (α = K_frame/K_total system) and b) the strength ratio (γ = V_damper/V_total system). The proposed damage-controlled approach contributes to the design of a more sustainable and resilient building because the structural damage is concentrated in the HEDS. The design displacement spectrum is reduced by means of a recently published damping factor for elastic structural systems with HEDS located in Mexico City. Two limit states are verified: serviceability and near collapse. Instead of the traditional trial-and-error approach, a procedure is proposed that allows the designer to establish the preliminary sizes of the structural elements of both systems. The design methodology is applied to an 8-story steel building with buckling-restrained braces located in the soft soil of Mexico City. With the aim of choosing the optimal design parameters, a parametric study is developed considering different values of α and γ. The simplified methodology is intended for the preliminary sizing, design, and evaluation of the effectiveness of HEDS, and it constitutes a modern and practical tool that enables the structural designer to select the best design parameters.

Keywords: damage-controlled buildings, direct displacement-based seismic design, optimal hysteretic energy dissipation systems, hysteretic dampers

Procedia PDF Downloads 483
2136 Optimal Control of Generators and Series Compensators within Multi-Space-Time Frame

Authors: Qian Chen, Lin Xu, Ping Ju, Zhuoran Li, Yiping Yu, Yuqing Jin

Abstract:

The operation of the power grid is becoming more and more complex and difficult due to its rapid development towards high voltage, long distance, and large capacity. For instance, many large-scale wind farms have been connected to the grid, and their fluctuation and randomness are likely to affect its stability and safety. Fortunately, many new types of equipment based on power electronics have been applied to the power grid, such as the UPFC (Unified Power Flow Controller), TCSC (Thyristor Controlled Series Compensation), and STATCOM (Static Synchronous Compensator), which can help deal with this problem. Compared with traditional equipment such as generators, new controllable devices, represented by FACTS (Flexible AC Transmission Systems), offer more accurate control and respond faster, but they are too expensive for wide deployment. Therefore, based on a comparison and analysis of the control characteristics of traditional equipment and new controllable equipment on both time and space scales, a coordinated optimal control method within a multi-space-time frame is proposed in this paper to exploit both kinds of advantages, improving both control ability and economic efficiency. Firstly, coordination across grid regions of different sizes is studied, focusing on the fluctuation caused by large-scale wind farms connected to the grid. With generators, FSC (Fixed Series Compensation), and TCSC, the coordination between a two-layer regional grid and its sub-grid is studied in detail: the coordination control model is built, the corresponding scheme is proposed, and the conclusion is verified by simulation. The analysis shows that the interface power flow can be controlled by the generators, while the power flow on specific lines between the two-layer regions can be adjusted by FSC and TCSC. The smaller the interface power flow adjusted by the generators, the bigger the control margin of the TCSC, but the higher the total generator consumption. Secondly, coordination across different time scales is studied to balance total generator consumption against the TCSC control margin and obtain the minimum control cost. The coordination between two-layer ultra-short-term correction and AGC (Automatic Generation Control) is studied with generators, FSC, and TCSC: the optimal control model is formulated, a genetic algorithm is selected to solve it, and the conclusion is verified by simulation. Finally, the method within the multi-time-space scale is analyzed with practical cases and simulated on the PSASP (Power System Analysis Software Package) platform; the simulation results verify its correctness and effectiveness. Moreover, this coordinated optimal control method can reduce control cost and will provide a reference for subsequent studies in this field.

Keywords: FACTS, multi-space-time frame, optimal control, TCSC

Procedia PDF Downloads 267
2135 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

The assessment of building sustainability against a specific green benchmark and the preparation of the documents required to receive a green building certification are both major challenges for the green building design team. However, this labor-intensive and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated to track potentially achievable credit points and provide rating feedback for several design options, using integrated Visual Programming (VP) to handle the parameters stored in the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as an input case to evaluate building sustainability at the design stage of the building project life cycle. The tool comprises two key models: first, a model for data extraction, calculation, and classification of achievable credit points in a green template; second, a model for generating the documents required for green building certification. The tool was validated on a BIM model of a residential building and serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 326
2134 Large-Capacity Image Information Reduction Based on Single-Cue Saliency Map for Retinal Prosthesis System

Authors: Yili Chen, Xiaokun Liang, Zhicheng Zhang, Yaoqin Xie

Abstract:

In an effort to restore visual perception in retinal diseases, an electronic retinal prosthesis with thousands of electrodes has been developed. The image processing strategies of a retinal prosthesis system convert the original images from the camera into a stimulus pattern that can be interpreted by the brain. In practice, the original images have a much higher resolution (256x256) than the stimulus pattern (such as 25x25), which poses the technical challenge of large-capacity image information reduction. In this paper, we focus on developing an efficient stimulus pattern extraction algorithm that uses a single-cue saliency map to extract salient objects from the image with an optimal trimming threshold. Experimental results showed that the proposed stimulus pattern extraction algorithm performs well for different scenes. In the performance experiment, our proposed SCSPE algorithm scored almost five times higher than Boyle's algorithm. Based on the experiments, we suggest that when there are salient objects in the scene (such as when a blind user meets or talks with people), the trimming threshold should be set around 0.4·max; in other situations, trimming threshold values between 0.2·max and 0.4·max give a satisfactory stimulus pattern.
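A minimal sketch of the reduction step, assuming a precomputed saliency map; the synthetic map, the block-averaging scheme, and the exact use of the 0.4·max trimming are illustrative of, not taken from, the paper's SCSPE algorithm:

```python
import numpy as np

def stimulus_pattern(saliency, out_shape=(25, 25), trim=0.4):
    """Reduce a high-resolution saliency map to an electrode stimulus
    pattern: zero everything below trim*max, then block-average down."""
    sal = np.where(saliency >= trim * saliency.max(), saliency, 0.0)
    h, w = sal.shape
    oh, ow = out_shape
    # Crop so the map tiles evenly into out_shape blocks.
    sal = sal[: (h // oh) * oh, : (w // ow) * ow]
    blocks = sal.reshape(oh, h // oh, ow, w // ow)
    return blocks.mean(axis=(1, 3))

# 256x256 synthetic saliency map with one bright salient object.
sal = np.zeros((256, 256))
sal[100:140, 100:140] = 1.0
pattern = stimulus_pattern(sal)
```

Each entry of the 25x25 result would drive one electrode, with non-salient regions suppressed by the trimming threshold.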

Keywords: retinal prosthesis, image processing, region of interest, saliency map, trimming threshold selection

Procedia PDF Downloads 246
2133 Fishing Waste: A Source of Valuable Products through Anaerobic Treatments

Authors: Luisa Maria Arrechea Fajardo, Luz Stella Cadavid Rodriguez

Abstract:

Fish is one of the most commercialized foods worldwide. However, this industry only takes advantage of about 55% of the product's weight, the rest is converted into waste, which is mainly composed of viscera, gills, scales and spines. Consequently, if these wastes are not used or disposed of properly, they cause serious environmental impacts. This is the case of Tumaco (Colombia), the second largest producer of marine fisheries on the Colombian Pacific coast, where artisanal fishermen process more than 50% of the commercialized volume. There, fishing waste is disposed primarily in the ocean, causing negative impacts on the environment and society. Therefore, in the present research, a proposal was made to take advantage of fishing waste through anaerobic treatments, through which it is possible to obtain products with high added value from organic waste. The research was carried out in four stages. First, the production of volatile fatty acids (VFA) in semi-continuous 4L reactors was studied, evaluating three hydraulic retention times (HRT) (10, 7 and 5 days) with four organic loading rates (OLR) (16, 14, 12 and 10 gVS/L/day), the experiment was carried out for 150 days. Subsequently, biogas production was evaluated from the solid digestate generated in the VFA production reactors, initially evaluating the biochemical methane potential (BMP) of 4 total solid concentrations (1, 2, 4 and 6% TS), for 40 days and then, with the optimum TS concentration (2 gVS/L/day), 2 HRT (15 and 20 days) in semi-continuous reactors, were evaluated for 100 days. Finally, the integration of the processes was carried out with the best conditions found, a first phase of VFA production from fishing waste and a second phase of biogas production from unrecovered VFAs and unprocessed material Additionally, an VFA membrane extraction system was included. 
In the first phase, under the best condition found (HRT: 7 days; OLR: 16 gVS/L/day), a liquid digestate with a VFA concentration of 59.04 gVFA/L and a production yield of 0.527 gVFA/gVS was obtained, with acetic acid and isobutyric acid as the predominant acids. In the second phase, a BMP of 0.349 Nm³CH₄/kgVS was reached, and the best HRT was found to be 20 days. In the integrated system, isovaleric, butyric and isobutyric acids were the VFAs with the highest extraction percentages, and a 106.67% increase in biogas production was achieved. This research shows that anaerobic treatment is a promising technology for the environmentally safe management of fishing waste and lays the basis for a possible biorefinery.

Keywords: biogas production, fishing waste, VFA membrane extraction, VFA production

Procedia PDF Downloads 117
2132 From Electroencephalogram to Epileptic Seizure Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm, using a software application called Training Builder that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding-window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
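As a hedged illustration of the pipeline described above (sliding-window feature extraction followed by a multilayer perceptron classifier), the sketch below uses scikit-learn and synthetic signals; it is not the authors' Training Builder tool, and the feature set and signal parameters are invented stand-ins:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def window_features(signal, width=256, step=128):
    """Slide a window over the signal and extract simple per-window features."""
    feats = []
    for start in range(0, len(signal) - width + 1, step):
        w = signal[start:start + width]
        feats.append([w.mean(), w.std(), w.min(), w.max(),
                      np.abs(np.diff(w)).mean()])  # rough "line length" proxy
    return np.array(feats)

rng = np.random.default_rng(0)
# Synthetic stand-ins for EEG segments: low-amplitude noise vs.
# high-amplitude oscillation (real seizure morphology is far more complex).
normal = rng.normal(0.0, 1.0, 20_000)
seizure = 5.0 * np.sin(np.linspace(0, 400 * np.pi, 20_000)) + rng.normal(0.0, 1.0, 20_000)

X = np.vstack([window_features(normal), window_features(seizure)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) - len(X) // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

On these easily separable synthetic features the perceptron reaches near-perfect accuracy; real EEG requires the richer feature extraction and selection the abstract describes.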

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 188
2131 Analysis of Energy Flows as an Approach for the Formation of a Monitoring System for Sustainable Regional Development

Authors: Inese Trusina, Elita Jermolajeva

Abstract:

Global challenges require a transition from the existing linear economic model to a model that considers nature as a life support system for development on the way to social well-being, within the frame of the ecological economics paradigm. The article presents basic definitions for the development of a formalized description of sustainable development monitoring. It provides examples of calculating the monitoring parameters for the Baltic Sea region countries and their preliminary interpretation.

Keywords: sustainability, development, power, ecological economics, regional economic, monitoring

Procedia PDF Downloads 120
2130 A Survey to Determine the Incidence of Piglet Mortality in Outdoor Farms in New Zealand

Authors: Patrick C. H. Morel, Ian W. Barugh, Kirsty L. Chidgey

Abstract:

The aim of this study was to quantify the level of piglet deaths in outdoor farrowing systems in New Zealand. A total of 14 farms were visited, the farmers interviewed, and data collected. A total of 10,154 sows were kept on those farms, representing an estimated 33% of the NZ sow herd, or 80% of the outdoor sow herd, in 2016. Data from 25,911 litters were available for the different analyses. The characteristics and reproductive performance for the years 2015-2016 of the 14 farms surveyed were analysed, with the following results. The average percentage of stillbirths was 7.1% (range 3.5-10.7%), and the average pre-weaning live-born mortality was 16.7% (range 3.7-23.6%). The majority of piglet deaths (89%) occurred during the first week after birth, with 81% of deaths occurring up to day three. The average number of piglets born alive per litter was 12.3 (range 8.0-14.0), and the average number of piglets weaned per sow per year was 22.4 (range 10.5-27.3). The average stocking rate (number of sows and mated gilts per ha) was 15.3 (range 2.8-28.6). The sow-to-boar ratio averaged 20.9:1 (range 7.1:1 to 63:1). The sow replacement rate ranged between 37% and 78%. There was a large variation in piglet live-born mortality, both between months within a farm and between farms within a given month. Monthly recorded piglet mortality ranged between 7.7% and 31.5%, and there was no statistically significant difference between months in the number of piglets born, born alive, or weaned, or in pre-weaning piglet mortality. Twelve different types of hut/farrowing systems were used on the 14 farms. No difference in piglet mortality was observed between A-frame, modified A-frame, and box-shaped huts. There was a positive relationship between the average number of piglets born per litter and the number of piglets born alive (r=0.975) or the number weaned per litter (r=0.845).
Moreover, as the average number of piglets born alive increased, both the pre-weaning live-born mortality rate and the number of piglets weaned increased: an increase of one piglet in the number born alive corresponded to an increase of 2.9% in live-born mortality and an increase of 0.56 piglets weaned. Farmers reported that staff are the key to success, the key attributes being reliability, attention to detail, and skill with the stock.
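The reported relationships are simple linear regressions; as a sketch (synthetic litter data generated with the slope quoted in the abstract, since the survey records are not public), the weaned-vs-born-alive fit could look like this:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical litter data: born-alive counts in the surveyed range (8-14),
# weaned counts generated with the abstract's slope (0.56 extra piglets
# weaned per extra piglet born alive) plus noise. Intercept is invented.
born_alive = rng.uniform(8.0, 14.0, 500)
weaned = 0.56 * born_alive + 3.0 + rng.normal(0.0, 0.3, 500)

slope, intercept = np.polyfit(born_alive, weaned, 1)
r = np.corrcoef(born_alive, weaned)[0, 1]
print(f"slope = {slope:.2f}, r = {r:.3f}")
```

A least-squares fit on such data recovers a slope close to 0.56 with a strong positive correlation, mirroring the survey's reported pattern.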

Keywords: mortality, piglets, outdoor, pig farm

Procedia PDF Downloads 115
2129 Numerical Investigation of Nanofluid Based Thermosyphon System

Authors: Kiran Kumar K., Ramesh Babu Bejjam, Atul Najan

Abstract:

A thermosyphon system is a heat transfer loop which operates on the basis of gravity and buoyancy forces. It guarantees good reliability and low maintenance cost, as it does not involve any mechanical pump. Therefore, it can be used in many industrial applications such as refrigeration and air conditioning, electronic cooling, nuclear reactors, geothermal heat extraction, etc. However, flow instabilities and loop configuration are the major problems in this system. Several previous studies have shown that these instabilities can be suppressed by using nanofluids as the loop fluid. In the present study, a rectangular thermosyphon loop with end heat exchangers is considered. This configuration is appropriate for many practical applications such as solar water heaters, geothermal heat extraction, etc. In the present work, a steady-state analysis is carried out on a thermosyphon loop with parallel-flow coaxial heat exchangers at the heat source and heat sink. In this loop, a nanofluid is the loop fluid and water is the external fluid in both the hot and cold heat exchangers. For this analysis, a one-dimensional homogeneous model is developed, in which the conservation equations for mass, momentum and energy are discretized using the finite difference method. A computer code was written in MATLAB to simulate the flow in the thermosyphon loop. A comparison in terms of heat transfer is made between water and nanofluid as working fluids in the loop.
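As a rough illustration of the discretization step, the sketch below (not the authors' MATLAB code; the geometry and property values are illustrative assumptions) marches the steady-state energy balance along one heated section of the loop with an upwind finite-difference scheme:

```python
import numpy as np

def temperature_profile(n=100, length=1.0, m_dot=0.05, cp=4180.0,
                        h=500.0, perimeter=0.05, t_in=300.0, t_ext=350.0):
    """Upwind finite-difference solve of m_dot*cp*dT/dx = h*P*(T_ext - T)
    for the loop fluid flowing through one heat-exchanger section.

    All parameter values are hypothetical (SI units): mass flow rate m_dot,
    specific heat cp, heat transfer coefficient h, wetted perimeter P,
    inlet temperature t_in, and external fluid temperature t_ext.
    """
    dx = length / n
    t = np.empty(n + 1)
    t[0] = t_in
    for i in range(n):
        # Explicit upwind step: heat picked up over dx raises the fluid temperature.
        t[i + 1] = t[i] + dx * h * perimeter * (t_ext - t[i]) / (m_dot * cp)
    return t

profile = temperature_profile()
print(f"inlet {profile[0]:.1f} K -> outlet {profile[-1]:.1f} K")
```

The profile rises monotonically toward the external fluid temperature, the expected behaviour for a co-current heated section; the full model couples this balance with the mass and momentum equations around the loop.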

Keywords: heat exchanger, heat transfer, nanofluid, thermosyphon loop

Procedia PDF Downloads 477
2128 High Performance Liquid Cooling Garment (LCG) Using ThermoCore

Authors: Venkat Kamavaram, Ravi Pare

Abstract:

Modern warfighters experience extreme environmental conditions in many of their operational and training activities. In temperatures exceeding 95°F, the body can no longer cool itself through convection and radiation; the only remaining cooling mechanism is evaporation. However, evaporative cooling is often compromised by excessive humidity, and natural cooling mechanisms can be further compromised by clothing and protective gear, which trap hot air and moisture close to the body. Creating an efficient heat extraction apparel system that is lightweight and does not hinder the dexterity or mobility of personnel working in extreme temperatures is a difficult technical challenge, and one that needs to be addressed to increase the probability of future success for the US military. To address this challenge, Oceanit Laboratories, Inc. has developed and patented a Liquid Cooled Garment (LCG) more effective than any on the market today. Oceanit's LCG is a form-fitting garment with a network of thermally conductive tubes that extracts body heat, and it can be worn under all authorized and chemical/biological protective clothing. Oceanit specifically designed and developed ThermoCore®, a thermally conductive polymer, for use in this apparel, optimizing the product for thermal conductivity, mechanical properties, manufacturability, and performance temperatures. Thermal manikin tests were conducted in accordance with ASTM F2371, Standard Test Method for Measuring the Heat Removal Rate of Personal Cooling Systems Using a Sweating Heated Manikin, in an environmental chamber using a 20-zone sweating thermal manikin. Manikin test results have shown that Oceanit's LCG provides significantly higher heat extraction under the same environmental conditions than the currently fielded Environmental Control Vest (ECV) while at the same time reducing weight.
Oceanit's LCG vests performed nearly 30% better in extracting body heat while weighing 15% less than the ECV. No cooling garments on the market provide the same thermal extraction performance, form factor, and reduced weight as Oceanit's LCG. The two cooling garments that are commercially available and most commonly used are the ECV and the Microclimate Cooling Garment (MCG).

Keywords: thermally conductive composite, tubing, garment design, form fitting vest, thermocore

Procedia PDF Downloads 115
2127 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has two major focuses: (1) accurate forecasting of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses both issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain intra-trajectory features with graph-based encoding, and inter-trajectory ones with a grid-based model and a back-projection technique that restores the surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 128
2126 Green Synthesis of a Magnetic Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of green tea leaf extracts (a total extract and a tannin extract) as reducing agents in a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNP)/iron oxide (Fe3O4) nanocomposite, based on the deposition of Fe3O4 onto MSNPs. The MSNP/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was found to be around 30 nm, with a high surface area (818 m²/g). The MSNP/Fe3O4 nanocomposite was used for removing the pesticide lindane (an environmentally hazardous material) from aqueous solutions. Fourier transform infrared spectroscopy, UV-vis spectroscopy, high-performance liquid chromatography, and gas chromatography were used to confirm the high ability of the MSNP/Fe3O4 nanocomposite to sense and capture lindane molecules, with a high sorption capacity (more than 89% removal), which could underpin a new eco-friendly strategy for detecting and removing pesticides and makes the nanocomposite a promising material for water treatment applications.
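For context, removal figures like the 89% reported above are conventionally derived from initial and residual concentrations; a minimal sketch of the standard formulas (the concentrations, volume, and adsorbent mass below are hypothetical, not the paper's measurements):

```python
def removal_efficiency(c0, ce):
    """Percent of adsorbate removed from solution, from initial (c0) and
    equilibrium (ce) concentrations in the same units (e.g. mg/L)."""
    return 100.0 * (c0 - ce) / c0

def adsorption_capacity(c0, ce, volume_l, mass_g):
    """Equilibrium capacity q_e = (C0 - Ce) * V / m, in mg adsorbate per g adsorbent."""
    return (c0 - ce) * volume_l / mass_g

# Illustrative example: 50 mg/L lindane reduced to 5 mg/L by 0.1 g of
# nanocomposite in 100 mL (0.1 L) of solution -- invented numbers.
print(removal_efficiency(50.0, 5.0))              # ≈ 90 %
print(adsorption_capacity(50.0, 5.0, 0.1, 0.1))   # ≈ 45 mg/g
```

The same two quantities are typically reported alongside HPLC/GC residual-concentration measurements in adsorption studies.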

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, lindane adsorption

Procedia PDF Downloads 436
2125 Spent Paint Solvent Recoveries by Ionic Liquids: Potential for Industrial Application

Authors: Mbongeni Mabaso, Kandasamy Moodley, Gan Redhi

Abstract:

The recovery of industrially valuable organic solvents from liquid waste generated in chemical processes is economically crucial for countries that need to import organic solvents. In view of this, the main objective of this study was to determine the ability of selected ionic liquids, namely 1-ethyl-3-methylimidazolium ethylsulphate, [EMIM][ESO4], and 1-ethyl-3-methylpyridinium ethylsulphate, [EMpy][ESO4], to recover aromatic components from spent paint solvents. Preliminary studies on the liquid waste, received from a paint manufacturing company, showed that the aromatic components were present in the range of 6-21% by volume. The separation of the aromatic components was performed with the ionic liquids listed above. The phases resulting from the separation of the mixtures were analysed with a gas chromatograph (GC) coupled to an FID detector. Chromatograms illustrate that the chosen ZB-Wax-Plus column gave excellent separation of all components of interest from the mixtures, including the isomers of xylene. The concentrations of aromatics recovered from the spent solvents were found to be in the ranges of 13-33% and 23-49% for the imidazolium and pyridinium ionic liquids, respectively. These results also show a significant correlation between the π-character of the ionic liquids and the level of extraction. It is therefore concluded that ionic liquids have the potential for macro-scale recovery of re-useable solvents present in liquid waste emanating from paint manufacture.

Keywords: synthesis, ionic liquid, imidazolium, pyridinium, extraction, aromatic solvents, spent paint organic solvents

Procedia PDF Downloads 337
2124 Development of a New Characterization Method to Analyse Cypermethrin Penetration in Wood Material by Immunolabelling

Authors: Sandra Tapin-Lingua, Katia Ruel, Jean-Paul Joseleau, Daouia Messaoudi, Olivier Fahy, Michel Petit-Conil

Abstract:

The preservative efficacy of organic biocides is strongly related to their capacity for penetration and retention within wood tissues. Specific detection of the pyrethroid insecticide is currently achieved after extraction, followed by chemical analysis using chromatography techniques. However, visualizing the insecticide molecule within the wood structure requires specific probes together with microscopy techniques. Therefore, the aim of the present work was to apply a new methodology based on antibody-antigen recognition and electron microscopy to visualize pyrethroids directly in the wood material. A polyclonal antibody directed against cypermethrin was developed and applied to Pinus sylvestris wood samples coated with technical cypermethrin. The antibody was tested on impregnated wood, and the specific recognition of the insecticide was visualized by transmission electron microscopy (TEM). The immunogold-TEM assay evidenced the capacity of the synthetic biocide to penetrate the wood. The depth of penetration was measured on sections taken at increasing distances from the coated surface of the wood. These results correlated with chemical analyses carried out by GC-ECD after extraction. In addition, the immuno-TEM investigation allowed visualizing, for the first time at the ultrastructural scale of resolution, that cypermethrin was able to diffuse within the secondary wood cell walls.

Keywords: cypermethrin, insecticide, wood penetration, wood retention, immuno-transmission electron microscopy, polyclonal antibody

Procedia PDF Downloads 413
2123 The Impact of the Method of Extraction on 'Chemchali' Olive Oil Composition in Terms of Oxidation Index and Chemical Quality

Authors: Om Kalthoum Sallem, Saidakilani, Kamiliya Ounaissa, Abdelmajid Abid

Abstract:

Introduction and purposes: Olive oil is the main oil used in the Mediterranean diet. Virgin olive oil is valued for its organoleptic and nutritional characteristics and is resistant to oxidation due to its high monounsaturated fatty acid (MUFA) content, its low polyunsaturated fatty acid (PUFA) content, and the presence of natural antioxidants such as phenols, tocopherols and carotenoids. The fatty acid composition, especially the MUFA content, and the natural antioxidants provide health advantages. The aim of the present study was to examine the impact of the method of extraction on the chemical profile of the 'Chemchali' olive oil variety, which is cultivated in the city of Gafsa, and to compare it with the 'Chetoui' and 'Chemlali' varieties. Methods: Our study is a qualitative prospective study of the 'Chemchali' olive oil variety. Analyses were conducted during three months (from December to February) in different oil mills in the city of Gafsa. We compared 'Chemchali' olive oil obtained by the continuous method to that obtained by the superpress method. We then analyzed quality index parameters, including free fatty acid content (FFA), acidity, and UV spectrophotometric characteristics, together with other physico-chemical data (oxidative stability, ß-carotene, and chlorophyll pigment composition). Results: Olive oil resulting from the superpress method, compared with the continuous method, was less acidic (0.612 vs. 0.976), less oxidizable (K232: 2.478 vs. 2.592; K270: 0.216 vs. 0.228), richer in oleic acid (61.61% vs. 66.99%), less rich in linoleic acid (13.38% vs. 13.98%), and richer in total chlorophyll pigments (6.22 ppm vs. 3.18 ppm) and ß-carotene (3.128 mg/kg vs. 1.73 mg/kg). 'Chemchali' olive oil showed a more balanced total fatty acid content compared with the 'Chemlali' and 'Chetoui' varieties. Gafsa's variety 'Chemchali' has significantly fewer saturated and polyunsaturated fatty acids, whereas it has a higher content of the monounsaturated fatty acid C18:1, compared with the two other varieties. Conclusion: The use of the superpress method had beneficial effects on the general chemical characteristics of 'Chemchali' olive oil, maintaining the highest quality according to the Ecocert legal standards. In light of the results obtained in this study, a more detailed study is required to establish whether the differences in the chemical properties of the oils are mainly due to agronomic and climate variables or to the processing employed in the oil mills.

Keywords: olive oil, extraction method, fatty acids, chemchali olive oil

Procedia PDF Downloads 383
2122 Purification, Extraction and Visualization of Lipopolysaccharide of Escherichia coli from Urine Samples of Patients with Urinary Tract Infection

Authors: Fariha Akhter Chowdhury, Mohammad Nurul Islam, Anamika Saha, Sabrina Mahboob, Abu Syed Md. Mosaddek, Md. Omar Faruque, Most. Fahmida Begum, Rajib Bhattacharjee

Abstract:

Urinary tract infection (UTI) is one of the most common infectious diseases in Bangladesh, where Escherichia coli is the prevalent organism and is responsible for most of the infections. Lipopolysaccharide (LPS) is known to act as a major virulence factor of E. coli. The present study aimed to purify, extract and visualize the LPS of E. coli clinical isolates from urine samples of patients with UTI. E. coli strains were isolated from the urine samples of 10 patients with UTI, and the antibiotic sensitivity pattern of the isolates was determined. Purification of LPS was carried out using the hot aqueous-phenol method, and the LPS was separated by sodium dodecyl sulfate polyacrylamide gel electrophoresis and stained directly using a modified silver staining method and Coomassie blue. The silver-stained gel demonstrated both smooth and rough type LPS, showing trail-like band patterns with and without the O-antigen region, respectively. Coomassie blue staining showed no bands, confirming the absence of any contaminating protein. Our successful extraction of purified LPS from E. coli isolates of UTI patients' urine samples can be an important step towards understanding UTI disease conditions.

Keywords: Escherichia coli, electrophoresis, polyacrylamide gel, silver staining, sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE)

Procedia PDF Downloads 389
2121 Terrorism in German and Italian Press Headlines: A Cognitive Linguistic Analysis of Conceptual Metaphors

Authors: Silvia Sommella

Abstract:

Islamic terrorism has gained a great deal of media attention in recent years, also because of the striking increase in terror attacks since 2014. The main aim of this paper is to illustrate the phenomenon of Islamic terrorism by applying frame semantics and metaphor analysis to German and Italian press headlines of the two online weekly publications Der Spiegel and L'Espresso between 2014 and 2019. This study focuses on how media discourse – through the use of conceptual metaphors – gives rise to a particular public reception of the phenomenon of Islamic terrorism and to the acceptance of governmental strategies and policies, with terrorists perceived as evildoers, as members of an uncivilised group 'other' opposed to the civilised group 'we'. The press headlines are analyzed on the basis of cognitive linguistics, namely Lakoff and Johnson's conceptualization of metaphor, to distinguish between abstract conceptual metaphors and specific metaphorical expressions. The study focuses on contexts, frames, and metaphors. The method adopted in this study is Konerding's frame semantics (1993). In a pilot lexicological study, Konerding carried out a hyperonym reduction of substantives on the basis of dictionaries – in particular the Duden Deutsches Universalwörterbuch (Duden Universal German Dictionary) – working exclusively with nouns, because hyperonyms usually occur in dictionary meaning explanations as the main elements of nominal phrases. The result of Konerding's hyperonym type reduction is a small set of German nouns corresponding to the highest hyperonyms, the so-called categories or matrix frames: 'object', 'organism', 'person/actant', 'event', 'action/interaction/communication', 'institution/social group', 'surroundings', 'part/piece', 'totality/whole', 'state/property'.
The second step of Konerding's pilot study consists of determining the potential reference points of each category, so that conventionally expectable routinized predications arise as predictors: Konerding identified which predicators the ascertained noun types can be linked to. For the purpose of this study, metaphorical expressions are listed and categorized under conceptual metaphors and under the matrix frames corresponding to each conceptual metaphor. All corpus analyses are carried out using the AntConc corpus software. The research will verify some previously analyzed metaphors, such as TERRORISM AS WAR, A CRIME, A NATURAL EVENT and A DISEASE, and will identify new conceptualizations and metaphors of Islamic terrorism, especially in Italian, such as TERRORISM AS A GAME, WARES, or A DRAMATIC PLAY. Through the identification of particular frames and their construction, the research seeks to understand the public reception of, and the way the discourse about Islamic terrorism is handled in, the above-mentioned online weekly publications through a contrastive analysis of German and Italian.
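The categorization step (assigning each metaphorical expression found in a headline to a conceptual metaphor and a matrix frame, then tallying the counts) can be sketched as follows; the expressions and mappings below are invented examples, not the study's data:

```python
from collections import Counter

# Hypothetical annotated expressions:
# (headline expression, conceptual metaphor, Konerding matrix frame)
annotations = [
    ("Krieg gegen den Terror", "TERRORISM AS WAR", "action/interaction/communication"),
    ("guerra al terrore", "TERRORISM AS WAR", "action/interaction/communication"),
    ("il virus del terrorismo", "TERRORISM AS A DISEASE", "state/property"),
    ("Terrorwelle", "TERRORISM AS A NATURAL EVENT", "event"),
]

# Tally how often each conceptual metaphor and each matrix frame occurs.
by_metaphor = Counter(metaphor for _, metaphor, _ in annotations)
by_frame = Counter(frame for _, _, frame in annotations)
print(by_metaphor.most_common(1))  # [('TERRORISM AS WAR', 2)]
```

In practice the expressions would come from concordance searches in AntConc, and the counts would feed the contrastive German-Italian comparison.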

Keywords: cognitive linguistics, frame semantics, Islamic terrorism, media

Procedia PDF Downloads 173