Search results for: data source.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8522

6302 Effect of Gamma Irradiation on the Crystalline Structure of Poly(Vinylidene Fluoride)

Authors: Adriana Souza M. Batista, Cláubia Pereira, Luiz O. Faria

Abstract:

The irradiation of polymeric materials has received much attention because it can produce diverse changes in chemical structure and physical properties. Studying these chemical and structural changes is therefore of practical importance for identifying the optimal conditions for polymer modification. The effect of gamma irradiation on the crystalline structure of poly(vinylidene fluoride) (PVDF) has been investigated using differential scanning calorimetry (DSC) and X-ray diffraction (XRD). Gamma irradiation was carried out in an air atmosphere with doses between 100 kGy and 3,000 kGy using a Co-60 source. The melting thermograms of the irradiated samples show a bimodal melting endotherm with two melting temperatures. The lower melting temperature is attributed to the melting of crystals originally present, and the higher melting peak to the melting of crystals reorganized upon heat treatment. These results are consistent with those obtained by XRD, which show increasing crystallinity with increasing irradiation dose, even though the melting latent heat decreases.
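
As a quick illustration of how DSC data of this kind are usually converted into a crystallinity figure, the sketch below computes the degree of crystallinity from a measured melting enthalpy. The enthalpy value is a placeholder, and the reference enthalpy for 100% crystalline PVDF (about 104.7 J/g) is a commonly cited literature value, not one taken from this paper.

```python
# Degree of crystallinity from DSC: Xc = (dH_m / dH_m_100%) * 100.
# The measured enthalpy below is a placeholder; the 104.7 J/g reference value for
# fully crystalline PVDF is a commonly cited literature figure, not from this paper.
DH_M_100 = 104.7            # J/g, reference melting enthalpy of 100% crystalline PVDF

def crystallinity(dh_m_j_per_g: float) -> float:
    """Return the degree of crystallinity in percent."""
    return 100.0 * dh_m_j_per_g / DH_M_100

print(f"Xc ~ {crystallinity(48.0):.1f}% for a hypothetical dH_m of 48 J/g")
```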

Keywords: Differential scanning calorimetry, gamma irradiation, PVDF, X-ray diffraction technique.

6301 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

Bioassay is the measurement of the potency of a chemical substance by its effect on a living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay prediction is then carried out to decide on further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is split into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy versus precision. The third step is normalization, where data are rescaled between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness with various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
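
A minimal sketch of the four preprocessing steps described above, assuming a generic tabular bioassay dataset with numeric descriptors and a binary "active" label; the file name, column names, and parameter choices are illustrative, not taken from the paper.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("bioassay.csv")              # hypothetical input file
X, y = df.drop(columns=["active"]), df["active"]

# Step 1: instance selection -- split into training, validation, and test sets.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Steps 2-4: discretization, 0-1 normalization, and feature selection,
# followed by one example learner for the prediction stage.
model = Pipeline([
    ("discretize", KBinsDiscretizer(n_bins=10, encode="ordinal", strategy="quantile")),
    ("normalize", MinMaxScaler()),            # rescale to the [0, 1] range
    ("select", SelectKBest(mutual_info_classif, k=20)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
model.fit(X_train, y_train)
print("validation accuracy:", model.score(X_val, y_val))
```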

Keywords: Bioassay, machine learning, preprocessing, virtual screen.

6300 Removal of Methylene Blue from Aqueous Solution by Using Gypsum as a Low Cost Adsorbent

Authors: Muhammad A. Rauf, I. Shehadeh, Amal Ahmed, Ahmed Al-Zamly

Abstract:

Removal of Methylene Blue (MB) from aqueous solution by adsorbing it on gypsum was investigated by the batch method. The studies were conducted at 25°C and included the effects of pH and initial concentration of Methylene Blue. The adsorption data were analyzed using the Langmuir, Freundlich and Temkin isotherm models. The maximum monolayer adsorption capacity was found to be 36 mg of the dye per gram of gypsum. The data were also analyzed in terms of their kinetic behavior and were found to obey the pseudo-second-order equation.
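
A minimal sketch of fitting the Langmuir isotherm, q_e = q_max·K_L·C_e/(1 + K_L·C_e), to equilibrium data of this kind; the arrays below are placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Equilibrium uptake (mg/g) as a function of equilibrium concentration (mg/L)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # hypothetical equilibrium concentrations, mg/L
qe = np.array([8.0, 14.0, 22.0, 29.0, 33.0])   # hypothetical uptakes, mg/g

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.05])
print(f"q_max = {qmax:.1f} mg/g, K_L = {KL:.3f} L/mg")
```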

Keywords: Adsorption, Dye, Gypsum, Kinetics, Methylene Blue.

6299 Analysis of the CO2 Emissions of Public Passenger Transport in Tianjin City of China

Authors: Tao Zhao, Xianshuo Xu

Abstract:

Low-carbon public passenger transport is an important part of a low-carbon city. The CO2 emissions of public passenger transport in Tianjin from 1995 to 2010 are estimated with the IPCC CO2 counting method, which shows that the total CO2 emissions of Tianjin public passenger transport have gradually stabilized at 1,425.1 thousand tons. The CO2 emissions of buses, taxis, and rail transit are then calculated separately. With CO2 emissions of 829.9 thousand tons, taxis are the largest CO2 emission source among the public passenger transport modes in Tianjin. Combining these estimates with passenger volume, this paper analyzes the CO2 emission shares of buses, taxis, and rail transit, compares the passenger transport shares with the CO2 emission shares, and examines the change in CO2 emissions per 10,000 passengers. Buses carry 72.62% of the passenger volume of the three public transport modes, much higher than their 36.01% share of CO2 emissions, and have the lowest CO2 emissions per 10,000 passengers at 4.90 tons. The countermeasures to reduce CO2 emissions of public passenger transport in Tianjin are to develop rail transit, renew the vehicle fleet, and use alternative-fuel vehicles.
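
A minimal sketch of the fuel-based (IPCC-style) CO2 accounting used above: total emissions are the sum over fuels of consumption times net calorific value times emission factor. The fuel quantities are placeholders, and the calorific values and emission factors are shown only for illustration, not as the coefficients used in the paper.

```python
FUELS = {
    # fuel: (consumption in tonnes, net calorific value in TJ/t, CO2 emission factor in t CO2/TJ)
    "gasoline": (120_000, 0.0443, 69.3),
    "diesel":   (150_000, 0.0430, 74.1),
}

def co2_emissions(fuels):
    """Return total CO2 emissions in thousand tonnes."""
    total_t = sum(q * ncv * ef for q, ncv, ef in fuels.values())
    return total_t / 1_000.0

print(f"total CO2: {co2_emissions(FUELS):.1f} thousand tonnes")
```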

Keywords: Public passenger transport, carbon emissions, countermeasures.

6298 The Traffic Prediction Multi-path Energy-aware Source Routing (TP-MESR) in Ad hoc Networks

Authors: Su Jin Kim, Ji Yeon Cho, Bong Gyou Lee

Abstract:

The purpose of this study is to suggest energy-efficient routing for ad hoc networks, which are composed of nodes with limited energy. Among the diverse problems arising from the limited energy supply of nodes, node energy management has received particular attention, and a number of protocols have been proposed for energy conservation and energy efficiency. In this study, the critical limitation of EA-MPDSR, namely that it performs energy-efficient routing over only two paths, is addressed. The proposed TP-MESR uses a multi-path routing technique and a traffic prediction function to increase the number of paths beyond two. Its efficiency is verified against EA-MPDSR using the network simulator NS-2. To give the work academic rigor and describe the protocol systematically, the research guidelines suggested by Hevner (2004) are applied. The proposed TP-MESR resolves the existing multi-path routing problems related to overhead, radio interference, and packet reassembly, and its contribution to the effective use of energy in ad hoc networks is confirmed.
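
The sketch below is only a rough, high-level illustration of the energy-aware multi-path idea, ranking more than two candidate paths by the residual energy of their weakest node. It is not the TP-MESR protocol itself, and the topology and energy values are invented for the example.

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([("S", "A"), ("A", "D"), ("S", "B"), ("B", "D"),
                  ("S", "C"), ("C", "D"), ("A", "B")])
energy = {"S": 1.0, "A": 0.4, "B": 0.9, "C": 0.7, "D": 1.0}   # residual energy per node (arbitrary units)

def bottleneck_energy(path):
    """Energy of the weakest intermediate node on the path."""
    inner = path[1:-1]
    return min(energy[n] for n in inner) if inner else float("inf")

# Enumerate simple paths, rank by bottleneck energy, keep the best k (> 2) paths.
k = 3
paths = sorted(nx.all_simple_paths(G, "S", "D"), key=bottleneck_energy, reverse=True)[:k]
for p in paths:
    print(p, "bottleneck energy:", bottleneck_energy(p))
```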

Keywords: Ad hoc, energy-aware, multi-path, routing protocol, traffic prediction.

6297 Design and Analysis of 1.4 MW Hybrid SAPS System for Rural Electrification in Off-Grid Applications

Authors: Arpan Dwivedi, Yogesh Pahariya

Abstract:

In this paper, the optimal design of a hybrid standalone power supply (SAPS) system is carried out for off-grid applications in remote areas where the transmission of power is difficult. The hybrid SAPS system uses two primary energy sources, wind and solar; in addition, a diesel generator is connected to meet the load demand in case of failure of the wind and solar systems. This paper presents the mathematical modeling of a 1.4 MW hybrid SAPS system for rural electrification. It focuses first on the mathematical modeling of PV modules connected in a string and second on the modeling of a permanent magnet wind turbine generator (PMWTG). A hybrid controller is also designed to select power from the available sources according to the load demand. The power output of the hybrid SAPS system is analyzed for meeting load demands in urban as well as rural areas.
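
A minimal sketch of a first-order PV string power model of the kind used in such sizing studies: output scales with irradiance and derates with cell temperature. All parameter values are illustrative assumptions, not the paper's module data.

```python
def pv_string_power(irradiance, cell_temp_c,
                    p_stc_w=300.0,        # module rating at standard test conditions (W)
                    modules_in_string=20,
                    gamma_per_c=-0.004):  # power temperature coefficient (1/deg C)
    """Approximate DC power of one PV string in watts."""
    derate = 1.0 + gamma_per_c * (cell_temp_c - 25.0)
    return modules_in_string * p_stc_w * (irradiance / 1000.0) * derate

print(f"{pv_string_power(800.0, 45.0):.0f} W")   # e.g. 800 W/m2 irradiance, 45 deg C cells
```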

Keywords: SAPS, DG, PMWTG, rural area, off grid, PV module.

6296 Highly Secure Data Hiding Using Image Cropping and Least Significant Bit Steganography

Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway

Abstract:

This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Crops at predefined secret coordinates are extracted from the cover image. The secret text message is divided into sections, the number of sections being equal to the number of image crops. Each section of the secret text message is embedded into an image crop, in a secret sequence, using the LSB technique. The embedding is done in the color channels of the cover image. The stego image is obtained by reassembling the image from the stego crops. The results of the technique are compared with other state-of-the-art techniques. Evaluation is based on visual inspection to detect any degradation of the stego image, the difficulty of extracting the embedded data for any unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results show that the proposed technique is more secure than the traditional techniques.
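
A minimal sketch of LSB embedding into a single image crop, assuming an 8-bit RGB crop held as a NumPy array; the crop, the message, and the embedding order are illustrative, and the paper's secret cropping and sequencing scheme is not reproduced here.

```python
import numpy as np

def embed_lsb(crop: np.ndarray, message: str) -> np.ndarray:
    """Write the message bits into the least significant bits of the crop's bytes."""
    bits = np.unpackbits(np.frombuffer(message.encode("utf-8"), dtype=np.uint8))
    flat = crop.reshape(-1).copy()
    if bits.size > flat.size:
        raise ValueError("message does not fit in this crop")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits   # clear the LSB, then set it
    return flat.reshape(crop.shape)

def extract_lsb(crop: np.ndarray, n_chars: int) -> str:
    bits = crop.reshape(-1)[: n_chars * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")

cover_crop = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in crop
secret = "section-1 of the secret"
stego_crop = embed_lsb(cover_crop, secret)
print(extract_lsb(stego_crop, len(secret)))
```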

Keywords: Steganography, stego, LSB, crop.

6295 Decision Trees for Predicting Risk of Mortality using Routinely Collected Data

Authors: Tessy Badriyah, Jim S. Briggs, Dave R. Prytherch

Abstract:

It is well known that Logistic Regression is the gold-standard method for predicting clinical outcomes, especially the risk of mortality. In this paper, the Decision Tree method is proposed for problems that commonly use Logistic Regression as a solution. The Biochemistry and Haematology Outcome Model (BHOM) dataset, obtained from Portsmouth NHS Hospital for 1 January to 31 December 2001, was divided into four subsets. One subset of training data was used to generate a model, and the model obtained was then applied to the three testing datasets. The performance of each model from both methods was compared using calibration (the χ² test) and discrimination (area under the ROC curve, or c-index). The experiments showed that both methods give reasonable results in terms of the c-index; however, in some cases the calibration value (χ²) was quite high. After conducting the experiments and weighing the advantages and disadvantages of each method, we conclude that Decision Trees can be seen as a worthy alternative to Logistic Regression in the area of Data Mining.
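
A minimal sketch of the comparison described above: fit Logistic Regression and a Decision Tree on one training subset, then score discrimination (c-index / AUC) and a Hosmer-Lemeshow-style calibration chi-square on a held-out subset. The data here are synthetic stand-ins, not the BHOM dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=4000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

def calibration_chi2(y_true, p_hat, n_bins=10):
    """Chi-square over deciles of predicted risk: sum (observed - expected)^2 / expected."""
    order = np.argsort(p_hat)
    chi2 = 0.0
    for chunk in np.array_split(order, n_bins):
        expected = p_hat[chunk].sum()
        observed = y_true[chunk].sum()
        chi2 += (observed - expected) ** 2 / max(expected, 1e-9)
    return chi2

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("decision tree", DecisionTreeClassifier(max_depth=5, random_state=0))]:
    p = clf.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(name, "c-index:", round(roc_auc_score(y_te, p), 3),
          "calibration chi2:", round(calibration_chi2(y_te, p), 1))
```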

Keywords: Decision Trees, Logistic Regression, clinical outcome, risk of mortality.

6294 Fermentation of Xylose and Glucose Mixture in Intensified Reactors by Scheffersomyces stipitis to Produce Ethanol

Authors: S. C. Santos, S. R. Dionísio, A. L. D. De Andrade, L. R. Roque, A. C. Da Costa, J. L. Ienczak

Abstract:

In this work, two fermentations at different temperatures (25 and 30 ºC), with cell recycling, were carried out to produce ethanol, using a mixture of commercial substrates, xylose (70%) and glucose (30%), as the organic source for Scheffersomyces stipitis. Five consecutive fermentations at 80 g L-1 (1st, 2nd and 3rd recycles), 96 g L-1 (4th recycle) and 120 g L-1 (5th recycle) of reducing sugars led to final maximum ethanol concentrations of 17.2 and 34.5 g L-1 at 25 and 30 ºC, respectively. Glucose was the preferred substrate; moreover, xylose degradation started only once the glucose remaining in the medium was nearly depleted. Results showed that the acid treatment of the yeast, performed before each cycle, improved cell viability, with an ethanol productivity of 2.16 g L-1 h-1 at 30 ºC. A maximum of 36% of the xylose remained in the fermentation medium, and after the five-cycle fermentation an ethanol yield of 0.43 g ethanol/g sugars was observed. S. stipitis showed better fermentation capacity and tolerance at 30 ºC, with 83.4% of the theoretical yield referenced to the initial biomass.
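
A quick worked check of the yield figures quoted above: the percent of theoretical yield follows from dividing the observed yield by the stoichiometric maximum of about 0.511 g ethanol per g sugar, the commonly used limit for glucose/xylose fermentation.

```python
THEORETICAL_YIELD = 0.511          # g ethanol / g sugar (stoichiometric maximum)
yield_g_per_g = 0.43               # reported ethanol yield, g ethanol / g sugars

percent_of_theoretical = 100.0 * yield_g_per_g / THEORETICAL_YIELD
print(f"{percent_of_theoretical:.1f}% of the theoretical yield")   # ~84%, close to the reported 83.4%
```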

Keywords: 5-carbon sugar, cell recycling fermenter, mixed sugars, xylose-fermenting yeast.

6293 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability

Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli

Abstract:

The agri-food value chain involves various stakeholders with different roles, all of whom abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry a large amount of data that are often not used to inform the final customer. Some of these data, if properly identified and used, can enhance a single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, buying models have recently changed: customers are attentive to wellbeing and food quality. Food citizenship and food democracy have emerged, leveraging transparency, sustainability and the need for food information. The Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the large number of actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies embarking on a traceability path, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by the different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful for increasing intra-company efficiency and competitiveness in the market. All the information recovered can be exposed through IT solutions and mobile applications, making it accessible to the company, the entire supply chain and the consumer, with a view to guaranteeing transparency and quality.

Keywords: Agriculture 4.0, agri-food supply chain, Industry 4.0, voluntary traceability.

6292 Info-participation of the Disabled Using the Mixed Preference Data in Improving Their Travel Quality

Authors: Y. Duvarci, S. Mizokami

Abstract:

Today, the preferences and participation of transportation disadvantaged (TD) groups such as the elderly and disabled are still lacking in the decision-making of transportation planning, and their reactions to certain types of policies are not well known; thus, a clear methodology is needed. This study aimed to develop a method to extract the preferences of the disabled for use at the policy-making stage, which can also guide future estimations. The method combines cluster analysis and data filtering, using data from Arao City (Japan). The process is as follows: define the TD group with the cluster analysis tool; tabulate their travel preferences from the household surveys by policy variable-impact pairs, by zone, and by trip purpose; the final outcome is the preference probabilities of the disabled. The preferences vary by trip purpose. For work trips, accessibility and transit system quality policies have accompanying impacts of modal shifts towards public modes, decreasing travel costs, and an increase in trip rates; for social trips, the same accessibility and transit system policies lead to the same mode shift impact, while the travel quality policy area leads to a trip rate increase. These results indicate which policies to focus on and can be used in scenario generation in models, or for any other planning purpose as a decision support tool.
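
A minimal sketch of the first stage described above: use cluster analysis on household-survey attributes to delineate the TD group, then filter that group and tabulate preference probabilities by trip purpose. The file name, column names, and the rule for picking the TD cluster are illustrative assumptions.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

survey = pd.read_csv("household_survey.csv")          # hypothetical survey file
features = survey[["age", "car_ownership", "mobility_limitation", "income"]]

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features))
survey["cluster"] = labels

# Assume the TD cluster is the one with the highest average mobility limitation.
td_cluster = survey.groupby("cluster")["mobility_limitation"].mean().idxmax()
td = survey[survey["cluster"] == td_cluster]

# Preference probabilities by trip purpose and stated policy preference.
prefs = td.groupby(["trip_purpose", "preferred_policy"]).size()
print(prefs / prefs.groupby(level=0).transform("sum"))
```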

Keywords: Transportation Disadvantaged, Disabled, Mixed Preference, Stated Preference Data.

6291 Application of Genetic Engineering for Chromium Removal from Industrial Wastewater

Authors: N. K. Srivastava, M. K. Jha, I. D. Mall, Davinder Singh

Abstract:

The treatment of industrial wastewater can be particularly difficult in the presence of toxic compounds. Excessive concentrations of chromium in soluble form are toxic to a wide variety of living organisms. Biological removal of heavy metals using natural and genetically engineered microorganisms has aroused great interest because of its lower impact on the environment. Ralstonia metallidurans, formerly known as Alcaligenes eutrophus, is a β-proteobacterium colonizing industrial wastewater with a high content of heavy metals. Tris-buffered mineral salt medium was used for growing Alcaligenes eutrophus AE104 (pEBZ141). The cells were cultivated for 18 h at 30 °C in Tris-buffered mineral salt medium containing 3 mM disodium sulphate and 46 mM sodium gluconate as the carbon source. The cells were harvested by centrifugation, washed, and suspended in 10 mM Tris-HCl, pH 7.0, containing 46 mM sodium gluconate and 5 mM chromium. Interaction between induction of the chr resistance determinant and chromate reduction has been demonstrated. The results of this study show that the above bacterium can be very useful for the bioremediation of chromium from industrial wastewater.

Keywords: Chromium, Genetic Engineering, Industrial Wastewater, Plasmid.

6290 Analyzing Current Transformer’s Transient and Steady State Behavior for Different Burdens Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, D. Sharma

Abstract:

Current transformers (CTs) are used to transform large primary currents into a small secondary current. Since most standard equipment is not designed to handle large primary currents, CTs play an important part in any electrical system for the purposes of metering and protection, both of which are integral to the power system. Nowadays, due to advancements in solid-state technology, the operating times of protective relays have come down from a few seconds to a few cycles. In such a scenario it becomes important to study the transient response of current transformers, as it plays a vital role in the operation of protective devices.

This paper shows the steady-state and transient behavior of current transformers and how it changes with changes in the connected burden. The transient and steady-state responses are captured using the LabVIEW data acquisition software, and the analysis is performed on the real-time data gathered with LabVIEW. The variation of current transformer characteristics with changes in burden is discussed.
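
A minimal sketch of the kind of post-processing applied to acquired CT waveforms: compute the RMS values of the sampled primary and secondary currents and the ratio error for a given burden. The waveforms below are synthetic stand-ins for the LabVIEW-acquired data, and the 400/5 ratio is an assumed example.

```python
import numpy as np

fs, f = 10_000, 50                              # sampling rate (Hz), system frequency (Hz)
t = np.arange(0, 0.2, 1 / fs)
i_primary = 400 * np.sqrt(2) * np.sin(2 * np.pi * f * t)            # ideal 400 A primary
i_secondary = 4.97 * np.sqrt(2) * np.sin(2 * np.pi * f * t - 0.01)  # measured-like secondary

def rms(x):
    return np.sqrt(np.mean(x ** 2))

kn = 400 / 5                                    # rated transformation ratio
ratio_error_pct = 100 * (kn * rms(i_secondary) - rms(i_primary)) / rms(i_primary)
print(f"ratio error: {ratio_error_pct:.2f}%")
```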

Keywords: Accuracy, Accuracy limiting factor, Burden, Current Transformer, Instrument Security factor.

6289 A Metric-Set and Model Suggestion for Better Software Project Cost Estimation

Authors: Murat Ayyıldız, Oya Kalıpsız, Sırma Yavuz

Abstract:

Software project effort estimation is frequently seen as complex and expensive by individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, but its importance has been ignored, especially in neural network based studies. In this study we explore the reasons for those disappointing results and implement different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that used traditional metrics. To be able to make comparisons, two types of data have been used: the first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is the fact that data collection requires time and care. To make a more thorough use of the samples collected, the k-fold cross-validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied to software cost estimation studies with success.
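
A minimal sketch of the modeling setup described above: a Multi-Layer Perceptron evaluated with k-fold cross-validation on effort-estimation data. The feature matrix here is random placeholder data standing in for COCOMO'81-style project metrics.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(63, 15))                          # placeholder project metrics
y = np.abs(rng.normal(loc=100, scale=40, size=63))     # placeholder effort (person-months)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
scores = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0),
                         scoring="neg_mean_absolute_error")
print("MAE per fold:", -scores.round(1))
```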

Keywords: Software Metrics, Software Cost Estimation, Neural Network.

6288 Leaching of Mineral Nitrogen and Phosphate from Rhizosphere Soil Stressed by Drought and Intensive Rainfall

Authors: J. Elbl, J. K. Friedel, J. Záhora, L. Plošek, A. Kintl, J. Přichystalová, J. Hynšt, L. Dostálová, K. Zákoutská

Abstract:

This work presents the first results from a long-term experiment focused on the impact of intensive rainfall and long periods of drought on microbial activities in soil. Fifteen lysimeters were prepared in the area of interest, a protection zone of an underground source of drinking water. The lysimeters were filled with topsoil and subsoil collected in this area and divided into two groups differing in fertilization and in the amount of water received during the growing season. The amount of microbial biomass and the leaching of mineral nitrogen and phosphates were chosen as the main indicators of microbial activities in soil. The content of mineral nitrogen and phosphates was measured in soil solution collected from each lysimeter. The amount of microbial biomass was determined in soil samples taken from the lysimeters before and after the long period of drought and intensive rainfall.

Keywords: Mineral nitrogen, Phosphates, Microbial activities, Drought, Precipitation.

6287 Synthesis of New Bio-Based Solid Polymer Electrolyte Polyurethane-LiClO4 via Prepolymerization Method: Effect of NCO/OH Ratio on Their Chemical, Thermal Properties and Ionic Conductivity

Authors: C. S. Wong, K. H. Badri, N. Ataollahi, K. P. Law, M. S. Su’ait, N. I. Hassan

Abstract:

A novel bio-based polymer electrolyte was synthesized with LiClO4 as the main source of charge carriers. Polyurethane-LiClO4 polymer electrolytes were synthesized via a prepolymerization method with different NCO/OH ratios and labelled PU1, PU2, PU3 and PU4. Fourier transform infrared (FTIR) analysis indicates co-ordination between the Li+ ion and the polyurethane in PU1. Differential scanning calorimetry (DSC) analysis indicates that PU1 has the highest glass transition temperature (Tg), corresponding to the most abundant urethane groups, which form the hard segment in PU1. Scanning electron microscopy (SEM) shows good miscibility between the lithium salt and the polymer. The study found that PU1 possessed the greatest ionic conductivity and the lowest activation energy, Ea. All the polyurethanes exhibited linear Arrhenius variations, indicating ion transport via simple lithium ion hopping in the polyurethane. This research shows that the NCO content in polyurethane plays an important role in the ionic conductivity of this polymer electrolyte.
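
A minimal sketch of extracting the activation energy Ea from an Arrhenius fit, sigma = sigma0·exp(-Ea/(k_B·T)), by linear regression of ln(sigma) against 1/T. The conductivity values below are made-up placeholders, not the paper's measurements.

```python
import numpy as np

K_B_EV = 8.617e-5                                   # Boltzmann constant, eV/K
T = np.array([303.0, 313.0, 323.0, 333.0, 343.0])   # temperatures, K
sigma = np.array([2.1e-6, 4.0e-6, 7.2e-6, 1.2e-5, 2.0e-5])  # conductivities, S/cm (placeholder)

slope, intercept = np.polyfit(1.0 / T, np.log(sigma), 1)
Ea_eV = -slope * K_B_EV
print(f"Ea ~ {Ea_eV:.2f} eV, sigma0 ~ {np.exp(intercept):.2e} S/cm")
```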

Keywords: Ionic conductivity, Palm kernel oil-based monoester polyol, polyurethane, solid polymer electrolyte.

6286 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition

Authors: L. Hamsaveni, Navya Prakash, Suresha

Abstract:

Document image analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis is adopted in this paper. The technique involves document imaging methods such as image fusion and Speeded-Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images and obtain an original document with complete information. In case the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. The YCbCr image format is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source.
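
A minimal sketch of the keypoint-detection step for aligning and fusing two versions of a degraded page with OpenCV. SURF lives in the opencv-contrib build and may be unavailable, so this sketch falls back to ORB; it illustrates the idea rather than reproducing the paper's exact SURF-based pipeline, and the input file names are hypothetical.

```python
import cv2

img1 = cv2.imread("degraded_page_1.png", cv2.IMREAD_GRAYSCALE)   # hypothetical inputs
img2 = cv2.imread("degraded_page_2.png", cv2.IMREAD_GRAYSCALE)

try:
    detector = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # needs opencv-contrib
    norm = cv2.NORM_L2
except (AttributeError, cv2.error):
    detector = cv2.ORB_create(nfeatures=2000)                     # freely available fallback
    norm = cv2.NORM_HAMMING

kp1, des1 = detector.detectAndCompute(img1, None)
kp2, des2 = detector.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(norm, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} keypoint matches between the two degraded versions")
```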

Keywords: Grayscale image format, image fusing, SURF detection, YCbCr image format.

6285 Attribute Selection Methods Comparison for Classification of Diffuse Large B-Cell Lymphoma

Authors: Helyane Bronoski Borges, Júlio Cesar Nievola

Abstract:

The most important subtype of non-Hodgkin's lymphoma is Diffuse Large B-Cell Lymphoma. Approximately 40% of the patients suffering from it respond well to therapy, whereas the remainder need more aggressive treatment in order to improve their chances of survival. Data mining techniques have helped to identify the class of the lymphoma in an efficient manner. Despite that, thousands of genes have to be processed to obtain the results. This paper presents a comparison of various attribute selection methods, aiming to reduce the number of genes to be searched and looking for a more effective procedure as a whole.
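
A minimal sketch of comparing attribute (feature) selection methods on a high-dimensional gene-expression-like dataset, measuring how a downstream classifier performs with each reduced gene set. The data are synthetic, and the selectors shown are generic examples rather than the specific methods compared in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Few samples, many "genes": the usual shape of microarray classification problems.
X, y = make_classification(n_samples=80, n_features=2000, n_informative=30, random_state=0)

selectors = {
    "ANOVA F-test": SelectKBest(f_classif, k=50),
    "mutual information": SelectKBest(mutual_info_classif, k=50),
}
for name, selector in selectors.items():
    clf = make_pipeline(selector, LinearSVC(dual=False))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```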

Keywords: Attribute selection, data mining.

6284 Despiking of Turbulent Flow Data in Gravel Bed Stream

Authors: Ratul Das

Abstract:

The present experimental study provides insight into the decontamination of instantaneous velocity fluctuations captured by an Acoustic Doppler Velocimeter (ADV) in gravel-bed streams, in order to ascertain near-bed turbulence at low Reynolds numbers. The interference between incident and reflected pulses produces spikes in the ADV data, especially in the near-bed flow zone, and therefore filtering the data is essential. Nortek’s Vectrino four-receiver ADV probe was used to capture the instantaneous three-dimensional velocity fluctuations over a non-cohesive bed. A spike removal algorithm based on the acceleration threshold method was applied to note the bed roughness and its influence on velocity fluctuations and velocity power spectra in the carrier fluid. The velocity power spectra of the despiked signals, with the best combination of velocity threshold (VT) and acceleration threshold (AT), are proposed; these spectra show a satisfactory fit with the Kolmogorov “–5/3 scaling law” in the inertial subrange. Also, the velocity distributions below the roughness crest level fairly follow a third-degree polynomial series.
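
A minimal sketch of acceleration-threshold despiking of an ADV velocity record: flag samples whose point-to-point acceleration exceeds a multiple of g and which also exceed a velocity threshold, then patch them by linear interpolation. The threshold values k and VT are illustrative assumptions; tuning them is exactly the VT/AT combination the study explores.

```python
import numpy as np

def despike(u, fs, k=1.5, vt_std=3.0, g=9.81):
    """Return a despiked copy of velocity series u (m/s) sampled at fs (Hz)."""
    a = np.diff(u, prepend=u[0]) * fs                  # point-to-point acceleration, m/s^2
    vt = vt_std * np.std(u - np.mean(u))               # velocity threshold from the signal itself
    spikes = (np.abs(a) > k * g) & (np.abs(u - np.mean(u)) > vt)
    clean = u.copy()
    good = ~spikes
    clean[spikes] = np.interp(np.flatnonzero(spikes), np.flatnonzero(good), u[good])
    return clean, spikes

fs = 100.0                                             # Hz
t = np.arange(0, 60, 1 / fs)
u = 0.35 + 0.05 * np.random.default_rng(0).standard_normal(t.size)   # synthetic record
u[50::997] += 1.0                                      # inject artificial spikes
u_clean, flagged = despike(u, fs)
print(f"{flagged.sum()} samples flagged and replaced")
```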

Keywords: Acoustic Doppler Velocimeter, gravel-bed, spike removal, Reynolds shear stress, near-bed turbulence, velocity power spectra.

6283 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain

Authors: M. Pushparani, A. Sagaya

Abstract:

Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare and transportation markets, as the emphasis shifts to intelligent devices. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. It will be used to estimate the weight at the site, compare it with the actual weight at the plantation, and build the necessary alerts when there is a discrepancy, thus enabling better decision making. In current practice, data are collected from various locations in various forms, and it is a challenge to consolidate them to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections make it difficult to obtain timely, accurate information. To overcome these challenges, the weight comparison algorithm is embedded on a portable device that also assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, thus enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate, uncompromised data using an embedded system and will provide alerts, therefore enabling effective operational management decision-making at the site.
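
A minimal sketch of the weight-comparison idea at its core: compare the weight estimated at the collection site against the actual weighbridge weight and raise an alert when the discrepancy exceeds a tolerance. The tolerance and record fields are illustrative assumptions, not values from the proposed EWCA-BI framework.

```python
from dataclasses import dataclass

TOLERANCE_PCT = 2.0   # assumed acceptable discrepancy, percent

@dataclass
class WeightRecord:
    ticket_id: str
    site_estimate_kg: float
    weighbridge_kg: float

def check(record: WeightRecord) -> str:
    diff_pct = 100.0 * abs(record.site_estimate_kg - record.weighbridge_kg) / record.weighbridge_kg
    if diff_pct > TOLERANCE_PCT:
        return f"ALERT {record.ticket_id}: discrepancy {diff_pct:.1f}% exceeds {TOLERANCE_PCT}%"
    return f"OK {record.ticket_id}: discrepancy {diff_pct:.1f}%"

print(check(WeightRecord("T-1001", site_estimate_kg=10_450, weighbridge_kg=10_120)))
```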

Keywords: Embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems.

6282 Human Resource Management Practices, Person-Environment Fit and Financial Performance in Brazilian Publicly Traded Companies

Authors: Bruno Henrique Rocha Fernandes, Amir Rezaee, Jucelia Appio

Abstract:

The relationship between Human Resource Management (HRM) practices and organizational performance remains the subject of a substantial literature. Although many studies have demonstrated a positive relationship, the major influencing variables are still not clear. This study considers Person-Environment Fit (PE Fit) and its components, Person-Supervisor (PS), Person-Group (PG), Person-Organization (PO) and Person-Job (PJ) Fit, as possible explanatory variables. We analyzed PE Fit as a moderator between HRM practices and financial performance in the “best companies to work for” in Brazil. Data on HRM practices were classified using the High Performance Working Systems (HPWS) construct, and data on PE Fit were obtained through surveys among employees. Financial data, consisting of return on invested capital (ROIC) and the price/earnings ratio (PER), were collected for the publicly traded best companies to work for. The findings show that PO Fit and PJ Fit play a significant moderating role for PER but not for ROIC.
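
A minimal sketch of the moderation test implied above: regress a financial outcome on HPWS, a fit dimension, and their interaction; a significant interaction term indicates that the fit dimension moderates the HRM-performance link. The CSV and column names are illustrative assumptions, not the study's dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("companies.csv")     # hypothetical columns: PER, ROIC, HPWS, PO_fit, PJ_fit

model = smf.ols("PER ~ HPWS * PO_fit", data=df).fit()   # HPWS * PO_fit adds the interaction term
print(model.summary().tables[1])      # inspect the HPWS:PO_fit coefficient and its p-value
```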

Keywords: Financial performance, human resource management, high performance working systems, person-environment fit.

6281 An Assessment of Water and Sediment Quality of the Danube River: Polycyclic Aromatic Hydrocarbons and Trace Metals

Authors: A. Szabó Nagy, J. Szabó, I. Vass

Abstract:

Water and sediment samples from the Danube River and the Moson Danube Arm (Hungary) were collected and analyzed for contamination by 18 polycyclic aromatic hydrocarbons (PAHs) and eight trace metal(loid)s (As, Cu, Pb, Ni, Cr, Cd, Hg and Zn) in the period 2014-2015. In addition, the trace metal(loid) concentrations were measured in the Rába and Marcal rivers (parts of the tributary system feeding the Danube). Total PAH contents in water were found to vary from 0.016 to 0.133 µg/L, and concentrations in sediments varied in the range 0.118-0.283 mg/kg. Source analysis of PAHs using diagnostic concentration ratios indicated that the PAHs found in sediments were of pyrolytic origin. The dissolved trace metal and arsenic concentrations were relatively low in the surface waters. However, higher concentrations were detected in the water samples of the Rába (Zn, Cu, Ni, Pb) and Marcal (As, Cu, Ni, Pb) compared to the Danube and Moson Danube. The concentrations of trace metals in sediments were higher than those found in the water samples.
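
A minimal sketch of the diagnostic-ratio approach to PAH source apportionment. The concentrations are placeholder values, and the cut-offs applied (e.g. Fl/(Fl+Py) > 0.5 suggesting combustion/pyrolytic sources, Ant/(Ant+Phe) > 0.1 suggesting pyrogenic sources) are commonly cited guideline thresholds, not values taken from this paper.

```python
sediment_ng_g = {            # hypothetical individual PAH concentrations, ng/g dry weight
    "fluoranthene": 28.0, "pyrene": 21.0, "anthracene": 3.5, "phenanthrene": 19.0,
}

fl_fl_py = sediment_ng_g["fluoranthene"] / (sediment_ng_g["fluoranthene"] + sediment_ng_g["pyrene"])
ant_ant_phe = sediment_ng_g["anthracene"] / (sediment_ng_g["anthracene"] + sediment_ng_g["phenanthrene"])

print(f"Fl/(Fl+Py) = {fl_fl_py:.2f} ->", "pyrolytic/combustion" if fl_fl_py > 0.5 else "petrogenic-leaning")
print(f"Ant/(Ant+Phe) = {ant_ant_phe:.2f} ->", "pyrogenic" if ant_ant_phe > 0.1 else "petrogenic")
```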

Keywords: Surface water, sediment, PAH, trace metal.

6280 Effect of Rotor to Casing Ratios with Different Rotor Vanes on Performance of Shaft Output of a Vane Type Novel Air Turbine

Authors: Bharat Raj Singh, Onkar Singh

Abstract:

This paper deals with a new concept of using compressed atmospheric air as a zero-pollution power source for running motorbikes. The motorbike is equipped with an air turbine in place of an internal combustion engine and transforms the energy of the compressed air into shaft work. The mathematical modeling and performance evaluation of a small-capacity compressed-air-driven vane-type novel air turbine is presented in this paper. The effect of isobaric admission and adiabatic expansion of high-pressure air for different rotor-to-casing diameter ratios and different vane angles (numbers of vanes) has been considered and analyzed. It is found that the shaft work output is optimum for some typical values of the rotor/casing diameter ratio at a particular value of the vane angle (number of vanes). In this study, the maximum power obtained is 4.5-5.3 kW (5.5-6.25 HP) when the casing diameter is 100 mm and the rotor-to-casing diameter ratio is kept between 0.55 and 0.65. This output is sufficient to run a motorbike.
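
For orientation, the ideal work for a cycle with isobaric admission, adiabatic expansion and isobaric exhaust, which is the textbook idealization of the process named above, can be written as follows. This is a standard expression, not the detailed vane-geometry model developed in the paper.

```latex
% Ideal work per cycle: isobaric admission at p_1, adiabatic expansion from
% (p_1, V_1) to (p_2, V_2), isobaric exhaust at p_2.
W = p_1 V_1 + \frac{p_1 V_1 - p_2 V_2}{\gamma - 1} - p_2 V_2
  = \frac{\gamma}{\gamma - 1}\left(p_1 V_1 - p_2 V_2\right),
\qquad p_1 V_1^{\gamma} = p_2 V_2^{\gamma}.
```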

Keywords: zero pollution, compressed air, air turbine, vane angle, rotor / casing diameter ratio

6279 RS-485 Based SCADA System for Longer Distance Powered Devices

Authors: Harkishen Singh, Gavin Mangeni

Abstract:

This project aims at building an efficient and automatic power-monitoring SCADA system, capable of monitoring the electrical parameters of high-voltage powered devices in real time, for example RMS voltage and current, frequency, energy consumed, and power factor. The system uses the RS-485 serial communication interface to transfer data over longer distances. Embedded C is the platform used to develop the two hardware modules, namely the RTU and Master Station modules, both of which use the CC2540 BLE 4.0 microcontroller configured in slave/master mode. The galvanically isolated Si8900 microchip is used to perform the ADC externally. The hardware communicates via a UART port and sends data to the user PC over USB. LabVIEW software is used to design a user interface that displays the current state of the monitored power loads and logs data to an Excel spreadsheet file. An understanding of the Si8900’s auto baud rate process is key to the successful implementation of this project.
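
A rough PC-side illustration, not the project's LabVIEW front end, of logging frames arriving from the master station over a USB serial port with pyserial. The port name, baud rate, and the comma-separated frame format are assumptions made only for this sketch.

```python
import csv
import serial   # pip install pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as link, \
        open("power_log.csv", "a", newline="") as log:
    writer = csv.writer(log)
    writer.writerow(["vrms", "irms", "frequency_hz", "power_factor"])
    for _ in range(100):                          # log 100 frames, then stop
        line = link.readline().decode("ascii", errors="ignore").strip()
        if line:                                  # assumed frame form: "230.1,4.92,50.02,0.97"
            writer.writerow(line.split(","))
```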

Keywords: SCADA, RS-485, CC2540, LabVIEW, Si8900.

6278 Bridge Health Monitoring: A Review

Authors: Mohammad Bakhshandeh

Abstract:

Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.

Keywords: Structural health monitoring, bridge health monitoring, sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis.

6277 Parallelization and Optimization of SIFT Feature Extraction on Cluster System

Authors: Mingling Zheng, Zhenlong Song, Ke Xu, Hengzhu Liu

Abstract:

Scale Invariant Feature Transform (SIFT) has been widely applied, but extracting SIFT features is complicated and time-consuming. In this paper, to meet the demands of real-time applications, SIFT is parallelized and optimized on a cluster system; the result is named pSIFT. Redundant storage and communication are used for boundary data to improve performance, and before the representation of the feature descriptors, data reallocation is adopted to keep the load balanced in pSIFT. Experimental results show that pSIFT achieves good speedup and scalability.
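
A rough single-machine analogue of the idea, not the paper's cluster implementation: split the image into tiles with overlapping borders (the redundant boundary data), extract SIFT keypoints per tile in parallel worker processes, and merge the results. Tile count and overlap width are illustrative choices, and the input file is hypothetical.

```python
import cv2
from multiprocessing import Pool

OVERLAP = 32   # pixels of redundant boundary shared between neighbouring tiles

def sift_on_tile(args):
    tile, x0, y0 = args
    kps = cv2.SIFT_create().detect(tile, None)
    return [(kp.pt[0] + x0, kp.pt[1] + y0) for kp in kps]   # shift back to image coordinates

def parallel_sift(img, tiles_per_side=2, workers=4):
    h, w = img.shape
    jobs = []
    for i in range(tiles_per_side):
        for j in range(tiles_per_side):
            y0, x0 = i * h // tiles_per_side, j * w // tiles_per_side
            y1, x1 = (i + 1) * h // tiles_per_side, (j + 1) * w // tiles_per_side
            jobs.append((img[max(0, y0 - OVERLAP):y1 + OVERLAP,
                             max(0, x0 - OVERLAP):x1 + OVERLAP],
                         max(0, x0 - OVERLAP), max(0, y0 - OVERLAP)))
    with Pool(workers) as pool:
        return [pt for tile_pts in pool.map(sift_on_tile, jobs) for pt in tile_pts]

if __name__ == "__main__":
    image = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
    print(len(parallel_sift(image)), "keypoints (duplicates in the overlaps not yet removed)")
```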

Keywords: cluster, image matching, parallelization and optimization, SIFT.

6276 Data Traffic Dynamics and Saturation on a Single Link

Authors: Reginald D. Smith

Abstract:

The dynamics of User Datagram Protocol (UDP) traffic over Ethernet between two computers are analyzed using nonlinear dynamics, which shows that there are two clear regimes in the data flow: free flow and saturated. The two most important variables affecting this are the packet size and the packet flow rate. However, the transition is due to a transcritical bifurcation rather than a phase transition, as in models of vehicle traffic or theorized large-scale computer network congestion. It is hoped this model will help lay the groundwork for further research on the dynamics of networks, especially computer networks.
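
For reference, the transcritical bifurcation mentioned above is the exchange-of-stability scenario whose normal form is the standard expression below; mapping r and x onto the traffic variables (packet rate, packet size, throughput) is the paper's contribution and is not reproduced here.

```latex
% Normal form of a transcritical bifurcation: the fixed points x* = 0 and x* = r
% exchange stability as the control parameter r passes through zero.
\frac{dx}{dt} = r x - x^{2}, \qquad x^{*} \in \{0,\ r\}.
```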

Keywords: congestion, packet flow, Internet, traffic dynamics, transcritical bifurcation

6275 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground), and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices such as smart meters to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available PQM placement methods. This paper presents the design of a PQM that is capable of integrating into an existing DAS infrastructure to take advantage of available placement methodologies. The monitoring component of the design is implemented and installed to monitor an existing LV network. Data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana is modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze PQ disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.
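
A minimal sketch of sag/overvoltage detection from a sampled voltage waveform: compute a sliding one-cycle RMS and compare it with per-unit thresholds. The 0.9/1.1 pu limits follow commonly used IEEE 1159-style definitions, and the waveform here is synthetic rather than data from the installed monitor.

```python
import numpy as np

FS, F0, V_NOM = 6400, 50, 230.0                       # sample rate (Hz), frequency (Hz), nominal RMS (V)
t = np.arange(0, 1.0, 1 / FS)
v = V_NOM * np.sqrt(2) * np.sin(2 * np.pi * F0 * t)
v[int(0.4 * FS):int(0.6 * FS)] *= 0.7                 # inject a 70% sag lasting 0.2 s

cycle = FS // F0
rms = np.sqrt(np.convolve(v ** 2, np.ones(cycle) / cycle, mode="same"))
pu = rms / V_NOM

sag = pu < 0.9
swell = pu > 1.1
print(f"sag duration: {sag.sum() / FS:.2f} s, overvoltage duration: {swell.sum() / FS:.2f} s")
```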

Keywords: Power quality, remote monitoring, distributed automation system, economic evaluation, LV network.

6274 Production Optimization through Ejector Installation at ESA Platform Offshore North West Java Field

Authors: Arii Bowo Yudhaprasetya, Ario Guritno, Agus Setiawan, Recky Tehupuring, Cosmas Supriatna

Abstract:

The condition of the offshore facilities of Pertamina Hulu Energi Offshore North West Java (PHE ONWJ) varies greatly from place to place, depending on the characteristics of the presently installed facilities. In some locations, such as the ESA platform, gas trapping is mainly caused by the flash gas phenomenon, a mechanical-physical separation process in multiphase flow. Consequently, gas trapped in the main oil line accumulates in certain areas and results in a reduced oil stream throughout the pipeline. Any presence of a discrete gaseous phase along a continuous oil flow represents a unique flow condition under a specific volume fraction and velocity field. A benefit line from the gas lift source is used as the motive flow for an ejector, which is designed to generate a syphon effect and minimize the gas trap phenomenon. The ejector’s exhaust stream therefore flows to the designated point without interfering with other systems.

Keywords: Ejector, diffuser, multiphase flow, syphon effects.

6273 Understanding Physical Activity Behavior of Type 2 Diabetics Using the Theory of Planned Behavior and Structural Equation Modeling

Authors: D. O. Omondi, M. K. Walingo, G. M. Mbagaya, L. O. A. Othuon

Abstract:

Understanding patient factors related to physical activity behavior is important in the management of Type 2 diabetes. This study applied the Theory of Planned Behavior model to understand physical activity behavior among sampled Type 2 diabetics in Kenya. The study was conducted within the diabetic clinic at Kisii Level 5 Hospital and adopted a sequential mixed-methods design, beginning with a qualitative phase and ending with a quantitative phase. Qualitative data were analyzed using the grounded theory analysis method. Structural equation modeling with maximum likelihood estimation was used to analyze the quantitative data. The common fit indices revealed that the Theory of Planned Behavior fitted the data acceptably well for Type 2 diabetics and physical activity behavior (χ² = 213, df = 84, n = 230, p = .061, χ²/df = 2.53; TLI = .97; CFI = .96; RMSEA (90% CI) = .073 (.029, .08)). This theory proved to be useful in understanding physical activity behavior among Type 2 diabetics.

Keywords: Physical activity, Theory of Planned Behavior, Type 2 diabetes, Kenya.
