Search results for: data block
24530 Employing KNIME-Based and Open-Source Tools to Identify AMI and VER Metabolites from UPLC-MS Data
Authors: Nouf Alourfi
Abstract:
This study examines the metabolism of amitriptyline (AMI) and verapamil (VER) using a KNIME-based method. KNIME is an open-source data-analytics platform whose workflows can integrate a number of open-source metabolomics tools, such as CFM-ID and MetFrag, to provide standard data visualisations, predict candidate metabolites, assess them against experimental data, and produce reports on identified metabolites. The use of this workflow is demonstrated by employing three types of liver microsomes (human, rat, and guinea pig) to study the in vitro metabolism of the two drugs. The workflow is used to process UPLC-MS (Orbitrap) data, and the formulas and structures of the drugs' metabolites are assigned automatically. The key metabolic routes for amitriptyline are hydroxylation, N-dealkylation, N-oxidation, and conjugation, while N-demethylation, O-demethylation, N-dealkylation, and conjugation are the primary metabolic routes for verapamil. The identified metabolites are consistent with those previously published, demonstrating the robustness of the workflow and the value of computational platforms like KNIME in supporting the integration and interoperability of emerging software packages in the metabolomics area.
Keywords: KNIME, CFM-ID, MetFrag, data analysis, metabolomics
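The candidate-assessment step described above (predicted metabolites checked against experimental data) can be sketched in a simplified form. This is an illustrative Python sketch, not the paper's KNIME workflow: the two candidate masses are standard monoisotopic [M+H]+ values for known amitriptyline metabolites, but the observed peak list and the 5 ppm tolerance are hypothetical.

```python
# Illustrative sketch: matching predicted candidate metabolite masses
# against experimental UPLC-MS peaks within a ppm tolerance, as a
# metabolite-identification step might do.

def match_candidates(candidates, observed_mz, tol_ppm=5.0):
    """Return (name, observed m/z) pairs within tol_ppm of a candidate mass."""
    hits = []
    for name, mass in candidates.items():
        for mz in observed_mz:
            if abs(mz - mass) / mass * 1e6 <= tol_ppm:
                hits.append((name, mz))
    return hits

# Candidate list: protonated monoisotopic masses of two AMI metabolites
candidates = {
    "nortriptyline [M+H]+": 264.1747,
    "10-hydroxyamitriptyline [M+H]+": 294.1852,
}
# Hypothetical experimental peak list (m/z)
observed = [264.1749, 301.1410, 294.1850]
print(match_candidates(candidates, observed))
```

With the peaks above, the two candidate metabolites are matched (both within about 1 ppm) and the unrelated peak at m/z 301.1410 is rejected.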
Procedia PDF Downloads 117
24529 The Depth Penetration of Beryllium-7 (⁷Be) as a Tracer in the Sembrong Catchment Area Study
Authors: J. Sharib, D. N. A. Tugi, M. T. Ishak, M. I. A. Adziz
Abstract:
The main purpose of this study was to investigate the penetration of ⁷Be into the soil surface during two different seasons in areas of differing agricultural activity. The study was conducted during the dry and wet seasons, from January to May 2019, in the Sembrong catchment area. The Sembrong catchment is located in the district of Kluang, Johor, in the south of Peninsular Malaysia, and was selected for its small size and the variety of agricultural activities surrounding it. A total of twenty (20) soil cores, each to a depth of 10 cm, were taken using a metal corer. All samples were brought to the Radiochemistry and Environment Group (RAS), Nuclear Malaysia, Block 23, Bangi, Malaysia, for preparation, drying, and analysis. The samples were oven-dried at 45-60 ºC until the dry weight became constant and were gently disaggregated. Finally, the dried samples were milled and sieved at 2 mm before being packed into a well-type container, ready for ⁷Be analysis. The results show that ⁷Be activity decreases exponentially with depth below the soil surface. The relaxation mass depth h₀, describing the distribution of the profile into the soil, ranged from 1.56 to 3.62 kg m⁻² in the dry season and from 2.59 to 4.17 kg m⁻² in the wet season; the dry season thus gave lower h₀ values than the wet season. In conclusion, ⁷Be is a very suitable tracer for determining depth penetration into the soil surface (h₀ values) across the two seasons.
Keywords: depth penetration, dry season, wet season, Sembrong catchment, well-type container
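The exponential depth profile reported in the abstract, A(x) = A(0)·exp(-x/h₀) with x the cumulative mass depth, lends itself to a simple log-linear fit. The sketch below recovers h₀ from a synthetic profile; the data points are hypothetical, generated with h₀ = 3.0 kg m⁻², which lies within the reported 1.56-4.17 kg m⁻² range.

```python
# Illustrative sketch: recovering the relaxation mass depth h0 (kg m^-2)
# from an exponential 7Be activity profile by log-linear least squares.
import math

def fit_h0(depths, activities):
    """Fit ln A = ln A0 - x/h0 by ordinary least squares; return h0."""
    n = len(depths)
    xs, ys = depths, [math.log(a) for a in activities]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return -1.0 / slope  # slope of ln A vs x is -1/h0

# Synthetic profile generated with h0 = 3.0 kg m^-2
depths = [0.0, 1.0, 2.0, 4.0, 6.0]          # cumulative mass depth, kg m^-2
acts = [100.0 * math.exp(-x / 3.0) for x in depths]  # activity, arbitrary units
print(round(fit_h0(depths, acts), 2))
```

On noise-free synthetic data, the fit returns the generating value of h₀ exactly; on measured profiles it gives the least-squares estimate.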
Procedia PDF Downloads 125
24528 GIS for Simulating Air Traffic by Applying Different Multi-Radar Positioning Techniques
Authors: Amara Rafik, Bougherara Maamar, Belhadj Aissa Mostefa
Abstract:
Radar data is one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit those positions to the ATM system. For greater reliability, the radars are positioned so that their coverage areas overlap; an aircraft will therefore be detected by at least one of them. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. The ATM system must therefore calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem calls for a Geographic Information System (GIS), i.e. a geographical database on the one hand and geographical processing tools on the other. The objective of this work is to propose a GIS for traffic simulation that reconstructs the evolution of aircraft positions over time from a multi-source radar data set by applying these different techniques.
Keywords: ATM, GIS, radar data, air traffic simulation
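One of the several track-calculation techniques the abstract alludes to can be sketched as a variance-weighted average: each radar's position estimate for the aircraft is weighted by the inverse of its assumed measurement variance, so the more precise radar dominates the fused track. The coordinates and variances below are hypothetical, and this is only one candidate technique, not the paper's specific method.

```python
# Illustrative sketch: fusing overlapping radar plots of one aircraft
# into a single radar track by an inverse-variance weighted average.

def fuse_positions(plots):
    """plots: list of (x, y, variance). Returns the fused (x, y)."""
    wsum = sum(1.0 / v for _, _, v in plots)
    x = sum(px / v for px, _, v in plots) / wsum
    y = sum(py / v for _, py, v in plots) / wsum
    return x, y

# Three radars report slightly different positions for the same aircraft
# (hypothetical plane coordinates in km; variance reflects radar precision)
plots = [(102.0, 55.0, 1.0), (101.6, 55.4, 4.0), (102.2, 54.8, 2.0)]
print(fuse_positions(plots))
```

The fused track lies closest to the report of the most precise radar (the one with the smallest variance), which is the intended behavior of this weighting.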
Procedia PDF Downloads 83
24527 Integrating Multi-Criteria Decision Making and Spatial Data Warehouses in Geographic Information Systems
Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek
Abstract:
This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a 'GIS dominant' approach; the GIS operating tools will be used to operate the SDW. MCDM methods can provide solutions to sets of problems with various and multiple criteria. When the problem is complex enough to involve a spatial dimension, it makes sense to combine the MCDM process with other approaches, such as data mining and ascending analyses. In this paper, we present an experiment showing a geo-decisional methodology for SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with concepts from data mining, provides powerful tools to highlight inductions and information not obvious to traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension, and the integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information; however, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse. This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
Keywords: data warehouse, GIS, MCDM, SOLAP
Procedia PDF Downloads 175
24526 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City
Authors: Christian Kapuku, Seung-Young Kho
Abstract:
An accurate representation of the transportation system serving the region is one of the important aspects of transportation modeling. Such representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys, and time. However, in some cases, such as in developing countries, data deficiencies and time and budget constraints do not always allow such accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems, opportunities to tackle these issues have arisen. The objective of this paper is therefore to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct a digitized map of Transportation Analysis Zones from available scanned images. Centroids were then dynamically placed at the centers of activity using an activity density map. Next, the road network with its characteristics was built from OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show a high accuracy between the built network and the satellite images, which can mostly be attributed to the use of open source data.
Keywords: geographic information system (GIS), network construction, transportation database, open source data
Procedia PDF Downloads 166
24525 Analysis of Business Intelligence Tools in Healthcare
Authors: Avishkar Gawade, Omkar Bansode, Ketan Bhambure, Bhargav Deore
Abstract:
In recent years, a wide range of business intelligence (BI) technologies have been applied in different areas to support decision-making processes; BI enables the extraction of knowledge from data stores. BI tools are usually used in the public health field for financial and administrative purposes, with a dashboard in the presentation stage delivering information to end users. In this paper, we analyze some open source BI tools on the market and their applicability to the clinical sphere, taking into consideration the general characteristics of the clinical environment. A pervasive BI platform was developed using a real case in order to prove the tools' viability. The analysis of the various BI tools is carried out with the help of several parameters, such as data security, data integration, data quality, reporting and analytics, performance, scalability, and cost effectiveness.
Keywords: CDSS, EHR, business intelligence, tools
Procedia PDF Downloads 135
24524 Comparative Analysis of Different Land Use Land Cover (LULC) Maps in WRF Modelling over the Indian Region
Authors: Sen Tanmoy, Jain Sarika, Panda Jagabandhu
Abstract:
Studies of the impact of urbanization using the WRF-ARW model rely heavily on the static geographical information selected, including the domain configuration and land use land cover (LULC) data. Accurate representation of LULC data provides essential information for understanding urban growth and for simulating meteorological parameters such as temperature and precipitation. Researchers use different LULC data sets depending on availability and their requirements. As far as India is concerned, resources and data availability are very limited, so it is important to understand how to optimize results using limited LULC data. In this review article, we explore how LULC maps are generated from different sources in the Indian context and their significance in WRF-ARW modeling for studying urbanization, climate change, and other meteorological parameters. Bibliometric analyses were also performed based on the countries of study and indexed keywords. Finally, some key points are marked out for selecting the most suitable LULC map for any urbanization-related study.
Keywords: LULC, LULC mapping, LANDSAT, WRF-ARW, ISRO, bibliometric analysis
Procedia PDF Downloads 27
24523 Effect of Cocoa Pod Ash and Poultry Manure on Soil Properties and Cocoyam Productivity of Nutrient-Depleted Tropical Alfisol
Authors: T. M. Agbede, A. O. Adekiya
Abstract:
An experiment was carried out for three consecutive years at Owo, southwest Nigeria. The objective of the investigation was to determine the effect of Cocoa Pod Ash (CPA) and Poultry Manure (PM), applied solely and in combined form, as sources of fertilizer on soil properties, leaf nutrient composition, and the growth and yield of cocoyam. Three soil amendments, CPA, PM (sole forms), and CPA and PM (mixture), were applied at 7.5 t ha-1, with an inorganic fertilizer (NPK 15-15-15) at 400 kg ha-1 as a reference and natural soil fertility, NSF (control), arranged in a randomized complete block design with three replications. Results showed that the soil amendments significantly increased (p = 0.05) corm and cormel weights and growth of cocoyam, soil and leaf N, P, K, Ca and Mg, soil pH and organic carbon (OC) concentrations compared with the NSF (control). The mixture of CPA+PM increased corm and cormel weights, plant height and leaf area of cocoyam by 40, 39, 42, and 48%, respectively, compared with inorganic fertilizer (NPK), and by 13, 12, 15 and 7%, respectively, compared with PM alone. Sole or mixed forms of the soil amendments showed remarkable improvement in soil physical properties compared with NPK and the NSF (control). The mixture of CPA+PM applied at 7.5 t ha-1 was the most effective treatment in improving cocoyam yield and growth parameters and soil and leaf nutrient composition.
Keywords: cocoa pod ash, cocoyam, poultry manure, soil and leaf nutrient composition
Procedia PDF Downloads 370
24522 Data Projects for "Social Good": Challenges and Opportunities
Authors: Mikel Niño, Roberto V. Zicari, Todor Ivanov, Kim Hee, Naveed Mushtaq, Marten Rosselli, Concha Sánchez-Ocaña, Karsten Tolle, José Miguel Blanco, Arantza Illarramendi, Jörg Besier, Harry Underwood
Abstract:
One of the application fields for data analysis techniques and technologies gaining momentum is the area of social good or "common good", covering cases related to humanitarian crises, global health care, or ecology and environmental issues, among others. The promotion of data-driven projects in this field aims at increasing the efficacy and efficiency of social initiatives, improving the way these actions help humanity in general and people in need in particular. This application field, however, poses its own barriers and challenges when developing data-driven projects, lagging behind other scenarios. These challenges derive from aspects such as the scope and scale of the social issue to solve, cultural and political barriers, the skills of the main stakeholders and the technological resources available, the motivation to be engaged in such projects, and the ethical and legal issues related to sensitive data. This paper analyzes the application of data projects in the field of social good, reviewing its current state and noteworthy initiatives, and presenting a framework covering the key aspects to analyze in such projects. The goal is to provide guidelines for understanding the main challenges and opportunities for this type of data project, as well as identifying the main differences compared to "classical" data projects in general. A case study is presented on the initial steps and stakeholder analysis of a data project for the inclusion of refugees in the city of Frankfurt, Germany, in order to empirically confront the framework with a real example.
Keywords: data-driven projects, humanitarian operations, personal and sensitive data, social good, stakeholder analysis
Procedia PDF Downloads 325
24521 Slugging Frequency Correlation for High Viscosity Oil-Gas Flow in Horizontal Pipeline
Authors: B. Y. Danjuma, A. Archibong-Eso, Aliyu M. Aliyu, H. Yeung
Abstract:
In this experimental investigation, new data on slugging frequency for high viscosity oil-gas flow are reported. Scale experiments were carried out using air as the gas phase and mineral oil as the liquid phase in a 17 m long horizontal pipe of 0.0762 m internal diameter. The data set was acquired using two high-speed gamma densitometers at a data acquisition frequency of 250 Hz over a time interval of 30 seconds. For the range of flow conditions investigated, an increase in liquid oil viscosity was observed to strongly influence the slug frequency. A comparison of the present data with prediction models available in the literature revealed large discrepancies. A new correlation incorporating the effect of viscosity on slug frequency is proposed for horizontal flow, which represents the main contribution of this work.
Keywords: gamma densitometer, flow pattern, pressure gradient, slug frequency
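The measurement the abstract describes, slug frequency from a gamma densitometer time series, can be sketched as counting upward threshold crossings: a passing liquid slug raises the mixture density above a threshold, so each rising edge marks one slug. The 250 Hz sampling rate matches the abstract; the signal below is synthetic and the threshold value is hypothetical, as the paper's actual signal-processing method is not given.

```python
# Illustrative sketch: slug frequency estimated as the number of upward
# threshold crossings per second in a densitometer signal.
import math

def slug_frequency(signal, threshold, sample_rate_hz):
    """Count rising edges through `threshold`; return crossings per second."""
    crossings = sum(
        1 for prev, cur in zip(signal, signal[1:])
        if prev < threshold <= cur
    )
    return crossings / (len(signal) / sample_rate_hz)

# Synthetic 2 s record at 250 Hz containing 4 slug passages (2 Hz wave)
fs = 250
signal = [0.5 + 0.5 * math.sin(2 * math.pi * 2.0 * i / fs) for i in range(2 * fs)]
print(slug_frequency(signal, 0.8, fs))
```

On this synthetic record the estimator returns 2.0 slugs per second, as expected for four slugs in two seconds.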
Procedia PDF Downloads 410
24520 Transferring Data from a Glucometer to a Mobile Device via Bluetooth with Arduino Technology
Authors: Tolga Hayit, Ucman Ergun, Ugur Fidan
Abstract:
Being healthy is undoubtedly an indispensable necessity of human life. With technological improvements, various health monitoring and imaging systems have been developed in the literature to meet health needs, and monitoring and recording individual health data via wireless technology is part of these studies. Mobile devices, which are found in almost every house, have become an indispensable part of our lives, and have wireless technology infrastructure, play an important role in making health follow-up possible everywhere and at any time, which is why they are used in health monitoring systems. In this study, an Arduino, an open-source microcontroller board, was used, with a sample glucose measuring device connected in series. In this way, the glucose data (glucose level and time) obtained with the glucometer are transferred over a Bluetooth channel to a mobile device based on the Android operating system. A mobile application was developed using the Apache Cordova framework for listing the data, presenting them graphically, and reading data from the Arduino; HTML, JavaScript, and CSS are used in the coding. The data received from the glucometer are stored in the local database of the mobile device. The intention is that people can transfer their measurements to their mobile device using wireless technology and access graphical representations of their data. In this context, the aim of the study is to enable health monitoring using the different wireless technologies that present-day mobile devices support, and thus to contribute to other work done in this area.
Keywords: Arduino, Bluetooth, glucose measurement, mobile health monitoring
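The mobile-side parsing step of such a pipeline can be sketched as follows. The abstract's app is written with Apache Cordova; here the same idea is shown in Python for clarity, with a hypothetical "glucose_value,timestamp" record format for the lines the Arduino sends over the Bluetooth serial link (the paper does not specify its actual wire format).

```python
# Illustrative sketch: parsing "value,timestamp" records received over a
# Bluetooth serial link, skipping malformed lines, before storing them in
# the mobile device's local database.

def parse_readings(raw_lines):
    """Parse 'value,timestamp' records; skip malformed lines."""
    readings = []
    for line in raw_lines:
        parts = line.strip().split(",")
        if len(parts) != 2:
            continue
        try:
            readings.append({"glucose": float(parts[0]), "time": parts[1]})
        except ValueError:
            continue
    return readings

# Hypothetical received buffer, including one corrupted line
received = ["96,2018-03-01T08:15", "garbage", "110,2018-03-01T12:40"]
print(parse_readings(received))
```

Discarding malformed lines rather than raising is a deliberate choice here: a noisy serial link should not crash the logging application.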
Procedia PDF Downloads 321
24519 Analyzing the Risk-Based Approach in the General Data Protection Regulation: Basic Challenges Connected with Adopting the Regulation
Authors: Natalia Kalinowska
Abstract:
The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes the GDPR will bring, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea behind this new attitude to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures; the Polish implementing rules, for example, indicate even how long a password should be. Under the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with the specific data processing. Risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context, and purposes of the processing. The GDPR does not prescribe particular security measures; the recitals give only examples, such as anonymisation or encryption. It is the controller's decision which security measures are considered sufficient, and the controller will be responsible if these measures prove insufficient or if the identification of the risk level is incorrect. The regulation indicates several levels of risk: Recital 76 mentions risk and high risk, though some lawyers consider that there is one more category, low risk/no risk, covering processing that is unlikely to result in a risk to the rights and freedoms of natural persons.
The GDPR also names types of processing for which a controller does not have to evaluate the level of risk because they are classified as 'high risk' by default, e.g. processing special categories of data on a large scale, or processing using new technologies. The methodology includes analysis of legal regulations, e.g. the GDPR and the Polish Act on the Protection of Personal Data, together with ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.
Keywords: general data protection regulation, personal data protection, privacy protection, risk-based approach
Procedia PDF Downloads 251
24518 Cognitive Behavioral Modification in the Treatment of Aggressive Behavior in Children
Authors: Dijana Sulejmanović
Abstract:
Cognitive-behavioral modification (CBM) combines cognitive and behavioral learning principles to shape and encourage desired behaviors. A crucial element of CBM is that a change in behavior precedes awareness of how it affects others. CBM is oriented toward changing inner speech and learning to control behavior through self-regulation techniques; it aims to teach individuals how to develop the ability to recognize, monitor, and modify their thoughts, feelings, and behaviors. The literature review emphasizes the efficacy of the CBM approach in the treatment of children's hyperactivity and negative emotions such as anger. Earlier research shows how impulsive and hyperactive behavior, agitation, and aggression may prevent a child from actively monitoring and participating in regular classes, disrupting the classroom and the teaching process; such children may feel rejected and isolated and develop a long-term poor image of themselves and others. In this article, we show how the use of CBM, adapted to the child's age, can incorporate measures of cognitive and emotional functioning that help us better understand children's cognitive processes and their cognitive strengths and weaknesses, and identify factors that may influence their behavioral and emotional regulation. Such a comprehensive evaluation can also help identify cognitive and emotional risk factors associated with aggressive behavior, specifically the processes involved in modulating and regulating cognition and emotions.
Keywords: aggressive behavior, cognitive behavioral modification, cognitive behavioral theory, modification
Procedia PDF Downloads 324
24517 UAV's Enhanced Data Collection for Heterogeneous Wireless Sensor Networks
Authors: Kamel Barka, Lyamine Guezouli, Assem Rezki
Abstract:
In this article, we propose a protocol called DataGA-DRF (a protocol for Data collection using a Genetic Algorithm through Dynamic Reference points) that collects data from heterogeneous wireless sensor networks. The protocol is based on DGA (Destination selection according to Genetic Algorithm) to control the movement of the UAV (unmanned aerial vehicle) between dynamic reference points that virtually represent the deployed sensor nodes. The dynamics of these points ensure an even distribution of energy consumption among the sensors and also improve network performance. To determine the best points, DataGA-DRF uses a clustering algorithm such as K-Means.
Keywords: heterogeneous wireless networks, unmanned aerial vehicles, reference point, data collection, genetic algorithm
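The K-Means step mentioned above, computing reference points for the UAV as cluster centroids of the sensor node positions, can be sketched with a minimal pure-Python implementation of Lloyd's algorithm. The node coordinates are hypothetical, and the deterministic seeding (first k points) is a simplification for the sketch; real deployments would use proper centroid initialization.

```python
# Illustrative sketch: K-Means clustering of sensor node coordinates;
# the resulting centroids serve as the UAV's dynamic reference points.

def kmeans(points, k, iters=50):
    centroids = points[:k]  # deterministic seed, sufficient for the sketch
    for _ in range(iters):
        # assign each node to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                            (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # move each centroid to the mean of its cluster
        centroids = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

# Two spatially separated groups of sensor nodes (hypothetical coordinates)
nodes = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
print(kmeans(nodes, k=2))
```

The two centroids land at the middle of each node group, giving the UAV one reference point per cluster to visit.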
Procedia PDF Downloads 81
24516 Evaluating Environmental Impact of End-of-Life Cycle Cases for Brick Walls and Aerated Autoclave Concrete Walls
Authors: Ann Mariya Jose, Ashfina T.
Abstract:
Construction and demolition waste is a rising concern globally due to the amount of waste generated annually, the area taken up by landfills, and the adverse environmental impacts that follow. One of the primary causes of the rise in construction and demolition waste is a lack of facilities and knowledge for incorporating recycled materials into new construction. Bricks are a conventional material that has been used for construction for centuries, while Autoclaved Aerated Concrete (AAC) blocks are a newly emergent material in the market. This study evaluates the environmental impact of brick walls and AAC block walls using the tool One Click LCA, considering three end-of-life (EoL) scenarios: the materials are landfilled, recycled, or reused in a new building. The objective of the study is to evaluate the impact of these two wall types on environmental indicators such as Global Warming Potential (GWP), Acidification Potential (AP), Eutrophication Potential (EP), Ozone Depletion Potential (ODP), and Photochemical Ozone Creation Potential (POCP). The findings reveal that the GWP caused by landfilling is 16 times higher for bricks and 22 times higher for AAC blocks compared to reuse of the materials. The study recommends the effective use of AAC blocks in construction, and their reuse, to reduce overall emissions to the environment.
Keywords: construction and demolition waste, environmental impact, life cycle impact assessment, material recycling
Procedia PDF Downloads 102
24515 HPPDFIM-HD: Transaction Distortion and Connected Perturbation Approach for Hierarchical Privacy Preserving Distributed Frequent Itemset Mining over Horizontally-Partitioned Dataset
Authors: Fuad Ali Mohammed Al-Yarimi
Abstract:
Many algorithms have been proposed to provide privacy preservation in data mining. These protocols are based on two main approaches: the perturbation approach and the cryptographic approach. The first is based on perturbation of the valuable information, while the second uses cryptographic techniques. The perturbation approach is much more efficient but has reduced accuracy, while the cryptographic approach can provide solutions with perfect accuracy; however, the cryptographic approach is much slower and requires considerable computation and communication overhead. In this paper, a new scalable protocol is proposed that combines the advantages of perturbation and distortion with the cryptographic approach to perform privacy-preserving distributed frequent itemset mining on horizontally partitioned data. Both the privacy and performance characteristics of the proposed protocol are studied empirically.
Keywords: anonymity data, data mining, distributed frequent itemset mining, Gaussian perturbation, perturbation approach, privacy preserving data mining
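The perturbation side of such a protocol can be sketched very simply: before sharing its local itemset support counts, each site adds zero-mean Gaussian noise to them, trading a small loss of accuracy for privacy, exactly the efficiency/accuracy trade-off the abstract describes. The itemsets, counts, and noise scale below are hypothetical, and this is a generic Gaussian-perturbation sketch rather than the paper's HPPDFIM-HD protocol.

```python
# Illustrative sketch: Gaussian perturbation of local itemset support
# counts before they are shared with other mining sites.
import random

def perturb_supports(supports, sigma, seed=0):
    """Add zero-mean Gaussian noise (std dev sigma) to each support count,
    clamping at zero so perturbed supports remain non-negative."""
    rng = random.Random(seed)  # fixed seed for reproducibility of the sketch
    return {item: max(0.0, count + rng.gauss(0.0, sigma))
            for item, count in supports.items()}

# Hypothetical local support counts at one site
local_supports = {("bread",): 120, ("milk",): 95, ("bread", "milk"): 60}
noisy = perturb_supports(local_supports, sigma=5.0)
print(noisy)
```

With a small sigma the noisy counts stay close to the true ones, so global frequent itemsets can still be recovered approximately, while no site reveals its exact counts.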
Procedia PDF Downloads 503
24514 Experimental Investigation of Air-Water Two-Phase Flow Pattern in T-Junction Microchannel
Authors: N. Rassoul-ibrahim, E. Siahmed, L. Tadrist
Abstract:
Water management plays a crucial role in the performance and durability of PEM fuel cells. Whereas the membrane must be hydrated enough, liquid droplets formed by excess water can block the flow in the gas distribution channels and hinder fuel cell performance. The main purpose of this work is to increase the understanding of liquid transport and mixing through mini- or micro-channels for various engineering or medical process applications, including the cooling of equipment. As a first step, a technique was developed to automatically detect and characterize the two-phase flow patterns that may appear in such channels. The investigation, mainly experimental, was conducted on a transparent channel with a 1 mm x 1 mm square cross section and a 0.3 mm x 0.3 mm water injection normal to the gas channel. Three main flow patterns were identified: liquid slug, bubble flow, and annular flow. A flow map has been built according to the flow rates of both phases, with representative images of the observed flow structures. An analysis and discussion of the flow patterns in the mini-channel is provided and compared to the micro-channel case.
Keywords: two-phase flow, T-junction, micro- and mini-channels, clean energy, fuel cells, flow patterns, maps
Procedia PDF Downloads 74
24513 Implementation of Data Science in the Field of Homologation
Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande
Abstract:
For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. There are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract the relevant data, such as the expiry date and approval date. It is vital that the data taken from a certificate be accurate, as inaccuracy may lead to missing the re-homologation of certificates, resulting in non-compliance. There is therefore scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model by feeding it each country's basic data as PDF and JPG files through an ETL process; this model only needs to be trained once and will give increasingly accurate results afterwards. As an outcome, the expiry date and approval date of a certificate are obtained with a single click. This will help implement automation features at a broader level in the database where certificates are stored and will reduce human error to almost negligible levels.
Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform load)
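For contrast with the learned model described above, the extraction step can be sketched with plain regular expressions. The field labels and date format below are hypothetical; real certificates vary by country and language, which is precisely why the paper resorts to a trained model rather than fixed patterns.

```python
# Illustrative sketch (not the paper's trained model): pulling approval and
# expiry dates out of certificate text with regular expressions.
import re

DATE = r"(\d{2}[./-]\d{2}[./-]\d{4})"  # e.g. 03.05.2019, 03/05/2019

def extract_dates(text):
    """Return {'approval': ..., 'expiry': ...} for fields found in `text`."""
    fields = {}
    for label, key in [("Approval date", "approval"), ("Expiry date", "expiry")]:
        m = re.search(label + r"\s*[:\-]?\s*" + DATE, text, re.IGNORECASE)
        if m:
            fields[key] = m.group(1)
    return fields

# Hypothetical certificate snippet
cert = "Type approval RF-1234. Approval date: 03.05.2019 Expiry date: 03.05.2024"
print(extract_dates(cert))
```

A pattern-based extractor like this breaks as soon as the labels appear in another language or layout, whereas the trained model generalizes across certificate formats.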
Procedia PDF Downloads 160
24512 Additive Weibull Model Using Warranty Claims and Finite Element Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for a product give insight into its underlying issues and are often used by reliability engineers to build prediction models that forecast the failure rates of parts. However, there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of a product's total lifetime, usually covering only the infant mortality and useful life zones of the bathtub curve. Predicting with warranty data alone therefore does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of its lifetime, so for better predictability one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples to failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period, more quickly and at lower cost. In this work, the authors propose an Additive Weibull Model that makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part: one from existing warranty claims and the other from fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two Weibull models' parameters are estimated separately and combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
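The structure of an additive Weibull model can be sketched directly from its definition: the two Weibull hazard rates, one fitted to warranty claims (early life) and one to FEA fatigue life (wear-out), add together, so the combined reliability is R(t) = exp(-(t/η₁)^β₁ - (t/η₂)^β₂), the product of the two Weibull reliabilities. The shape/scale values below are hypothetical, chosen only so that the warranty term has β < 1 (decreasing early-life hazard) and the fatigue term has β > 1 (increasing wear-out hazard), which together produce a bathtub-shaped total hazard.

```python
# Illustrative sketch of the additive Weibull structure; parameter values
# are hypothetical, not estimates from the paper.
import math

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)^(beta-1), t > 0."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_reliability(t, p1, p2):
    """R(t) = exp(-(t/eta1)^beta1 - (t/eta2)^beta2)."""
    (b1, e1), (b2, e2) = p1, p2
    return math.exp(-((t / e1) ** b1) - ((t / e2) ** b2))

warranty = (0.8, 5000.0)    # beta < 1: early failures, from warranty claims
fatigue = (3.5, 20000.0)    # beta > 1: wear-out, from FEA fatigue analysis
t = 10000.0
total_hazard = weibull_hazard(t, *warranty) + weibull_hazard(t, *fatigue)
print(additive_reliability(t, warranty, fatigue), total_hazard)
```

Because the hazards add, the model inherits the warranty data's early-life behavior and the FEA data's wear-out behavior in a single reliability function.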
Procedia PDF Downloads 72
24511 Impact of Compost Application with Different Rates of Chemical Fertilizers on Corn Growth and Production
Authors: Reda Abdel-Aziz
Abstract:
Agricultural activities in Egypt generate around 35 million tons of waste annually. Composting is one of the most promising technologies for turning this waste over in an economical way; for many centuries, composting has been used as a means of recycling organic matter back into the soil to improve soil structure and fertility. Field experiments were conducted in two governorates, Giza and Al-Monofia, to find out the effect of compost combined with different rates of chemical fertilizers on the growth and yield of corn (Zea mays L.) during the two consecutive seasons of 2012 and 2013. The experiment, laid out in a randomized complete block design (RCBD), was carried out on five farmers' fields in each governorate. The treatments were: unfertilized control; full dose of NPK (120, 30, and 50 kg/acre, respectively); compost at a rate of 20 ton/acre; compost at a rate of 10 ton/acre + 25% of the chemical fertilizer; compost at a rate of 10 ton/acre + 50% of the chemical fertilizer; and compost at a rate of 10 ton/acre + 75% of the chemical fertilizer. Results revealed the superiority of the treatment of compost at 10 ton/acre + 50% of NPK, which caused significant improvement in the growth, yield, and nutrient uptake of corn in the two governorates during both seasons. The results showed that agricultural waste can be composted into a value-added soil amendment that enhances the efficiency of chemical fertilizer. Composting agricultural waste could also reduce the potential hazard of chemical fertilizers to the environment. Keywords: agricultural waste, compost, chemical fertilizers, corn production, environment
Procedia PDF Downloads 316
24510 Utilization of Sugar Factory Waste as an Organic Fertilizer on Growth and Production of Baby Corn
Authors: Marliana S. Palad
Abstract:
The purpose of this research was to determine the influence of blotong (a sugar factory waste) on the growth and production of baby corn. The research was arranged as a factorial experiment in a randomized complete block design (RCBD) with three replications. The first factor was fertilizer type: blotong (B1), blotong+EM4 (B2), and bokashi blotong (B3), while blotong dose was assigned as the second factor: 5 ton ha-1 (D1), 10 ton ha-1 (D2), and 15 ton ha-1 (D3). The results indicated that bokashi blotong gave the best influence on all parameters compared to blotong and blotong+EM4. The interaction between bokashi blotong and the 10 ton ha-1 dose gave the best influence on baby corn production, 4.41 ton ha-1; bokashi blotong likewise had the best influence on baby corn vegetative growth, namely a plant height of 113.00 cm, eight leaves, and a stem diameter of 6.02 cm. Results of the analysis of variance showed that application of bokashi blotong (B3) had a better effect on the growth and production of baby corn, and was highly significant for plant height at 60 days after planting, leaf number at 60 days after planting, cob length with and without cornhusk, stem and cob diameter, cob weight with and without cornhusk, and production converted into ton ha-1. This is because bokashi blotong has an organic content of C, N, P, and K totalling more than the blotong (B1) and blotong+EM4 (B2) treatments. Based on these results, it can be summarised that the sugar factory waste called blotong can be used to make bokashi as an organic fertilizer, improving the growth and production of baby corn. Keywords: blotong, bokashi, organic fertilizer, sugar factory waste
Procedia PDF Downloads 394
24509 Data-Focused Digital Transformation for Smart Net-Zero Cities: A Systems Thinking Approach
Authors: Farzaneh Mohammadi Jouzdani, Vahid Javidroozi, Monica Mateo Garcia, Hanifa Shah
Abstract:
The development of smart net-zero cities has in recent years attracted significant attention and interest from communities and scholars worldwide as a potential answer to the critical requirement for urban sustainability. This research-in-progress paper investigates the development of smart net-zero cities in order to propose a digital transformation roadmap for them, with a primary focus on data. Employing systems thinking as the underpinning theory, the study advocates the necessity of a holistic strategy for understanding the complex interdependencies and interrelationships that characterise urban systems. The proposed methodology involves an in-depth investigation of current data-driven approaches in smart net-zero cities, followed by predictive analysis to evaluate the holistic impact of those approaches on the move toward a smart net-zero city. The study is expected to produce a systemic intervention followed by a data-focused and systemic digital transformation roadmap for smart net-zero cities, contributing to a more holistic understanding of urban sustainability. Keywords: smart city, net-zero city, digital transformation, systems thinking, data integration, data-driven approach
Procedia PDF Downloads 21
24508 Analysis of an Alternative Data Base for the Estimation of Solar Radiation
Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag
Abstract:
The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To size thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has only a limited number of meteorological stations providing irradiation measurements, which makes reanalysis systems a quite significant alternative data platform. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2) and includes a four-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km, on 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results obtained are analysed for a study region as an alternative basis for estimating the energy potential of that region. Keywords: energy potential, reanalyses, renewable energy, solar radiation
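The comparison step described above is typically summarised with agreement statistics between the reanalysis series and the station observations. A minimal sketch, assuming mean bias error (MBE) and root-mean-square error (RMSE) as the metrics; the two monthly series below are invented numbers, not values from ERA-Interim or the atlas.

```python
import math

# Hedged sketch of comparing reanalysis irradiation estimates against
# station observations.  Both series are invented illustrative data
# (e.g. monthly means in kWh/m2/day), not values from the datasets cited.
observed   = [5.9, 5.4, 4.8, 3.9, 3.1, 2.7, 3.0, 3.6, 4.1, 5.0, 5.7, 6.1]
reanalysis = [5.6, 5.5, 4.6, 4.1, 3.0, 2.9, 2.8, 3.7, 4.3, 4.8, 5.9, 5.9]

def mbe(obs, est):
    """Mean bias error: positive means the estimate runs high on average."""
    return sum(e - o for o, e in zip(obs, est)) / len(obs)

def rmse(obs, est):
    """Root-mean-square error between the two series."""
    return math.sqrt(sum((e - o) ** 2 for o, e in zip(obs, est)) / len(obs))

bias = mbe(observed, reanalysis)
error = rmse(observed, reanalysis)
```

A small bias with an RMSE comparable to the measurement uncertainty would support using the reanalysis where no observed network exists.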
Procedia PDF Downloads 162
24507 Big Data Analytics and Public Policy: A Study in Rural India
Authors: Vasantha Gouri Prathapagiri
Abstract:
Innovations in the ICT sector facilitate a better quality of life for citizens across the globe. Countries that facilitate the usage of new ICT techniques, such as big data analytics, find it easier to fulfil the needs of their citizens. Big data is characterised by its volume, variety, and velocity; analytics involves processing it in a cost-effective way in order to draw conclusions for useful application. Big data analytics also draws on machine learning and artificial intelligence, leading to accuracy in data presentation that is useful for public policy making. Hence, using data analytics in policy making is a proper way to march towards the all-round development of any country. Data-driven insights can help a government take important strategic decisions with regard to the socio-economic development of its country. Developed nations like the UK and the USA are already far ahead on the path of digitisation with the support of big data analytics. India is a huge country and is currently on the path of massive digitisation, being realised through the Digital India Mission. Internet connections per household are rising every year. This translates into a massive data set that has the potential to transform the public services delivery system into an effective service mechanism for Indian citizens. In fact, when compared to developed nations, this capacity is underutilised in India, particularly in the administrative system in rural areas. The present paper focuses on the need to adopt big data analytics in Indian rural administration and its contribution towards faster development of the country. The results of the research point to the need for increased awareness and serious capacity building, with regard to big data analytics and its utility, among government personnel working in rural development. Multiple public policies are framed and implemented for rural development, yet the results are not as effective as they should be. Big data has a major role to play in this context, as it can assist in improving both policy making and implementation aimed at the all-round development of the country. Keywords: Digital India Mission, public service delivery system, public policy, Indian administration
Procedia PDF Downloads 159
24506 Developing Drought and Heat Stress Tolerant Chickpea Genotypes
Authors: Derya Yucel, Nigar Angın, Dürdane Mart, Meltem Turkeri, Volkan Catalkaya, Celal Yucel
Abstract:
Chickpea (Cicer arietinum L.), with its high protein content, is a vital food, especially in under-developed and developing countries for people who do not consume enough meat due to low income levels. The objective of the study is to evaluate the growth, yield, and yield components of chickpea genotypes under Mediterranean conditions and thereby determine the tolerance of chickpea genotypes to drought and heat stress. For this purpose, a total of 34 chickpea genotypes were used as material. The experiment was conducted according to a factorial randomized complete block design with three replications at the Eastern Mediterranean Research Institute, Adana, Turkey, during the 2014-15 growing season under three different growing conditions (winter sowing, irrigated late sowing, and non-irrigated late sowing). According to the results of this experiment, vegetative period, flowering time, podding time, maturity time, plant height, height of first pod, seed yield, and 100-seed weight ranged between 68.33 and 78.77 days, 85.00 and 94.22 days, 94.11 and 106.44 days, 198.56 and 214.44 days, 37.18 and 64.89 cm, 18.33 and 34.83 cm, 417.1 and 1746.4 kg/ha, and 14.02 and 45.02 g, respectively. Among the chickpea genotypes, Aksu, Arda, Çakır, F4 09 (X 05 TH 21-16189), and FLIP 03-108 were least affected by drought and heat stress. These genotypes can therefore be used as sources of drought and heat tolerance in further breeding programmes for developing drought- and heat-tolerant chickpea genotypes. Keywords: chickpea, drought stress, heat stress, yield
Procedia PDF Downloads 227
24505 Effects of Adding Condensed Tannin from Shrub and Tree Leaves in Concentrate on Sheep Production Fed on Elephant Grass as a Basal Diet
Authors: Kusmartono, Siti Chuzaemi, Hartutik dan Mashudi
Abstract:
Two studies were conducted, involving in vitro (Expt 1) and in vivo (Expt 2) measurements. Expt 1 aimed to evaluate the effects of adding condensed tannin (CT) extracts on gas production and the efficiency of microbial protein synthesis (EMPS); Expt 2 aimed to evaluate the effects of supplementing shrub/tree leaves as a CT source on the feed consumption, digestibility, N retention, body weight gain, and dressing percentage of growing sheep fed elephant grass (EG) as a basal diet. The ten shrub and tree leaves used as CT sources were wild sunflower (Tithonia diversifolia), mulberry (Morus macroura), cassava (Manihot utilissima), avicennia (Avicennia marina), calliandra (Calliandra calothyrsus), sesbania (Sesbania grandiflora), acacia (Acacia villosa), gliricidia (Gliricidia sepium), jackfruit (Artocarpus heterophyllus), and moringa (Moringa oleifera). The treatments applied in Expt 1 were: T1 = elephant grass (60%) + concentrate (40%); T2 = T1 + CT (3% DM); T3 = T2 + PEG; T4 = T1 + CT (3.5% DM); T5 = T4 + PEG; T6 = T1 + CT (4% DM); and T7 = T6 + PEG. The data obtained were analysed using a randomized block design. Statistical analyses showed that the treatments significantly affected (P<0.05) total gas production and EMPS. The lowest total gas production (45.9 ml/500 mg DM) and the highest EMPS (64.6 g/kg BOTR) were observed in treatment T4 (3.5% CT from cassava leaf extract). Based on this result, it was concluded that this treatment was the best, and it was chosen for further investigation using the in vivo method. The treatments applied in the in vivo trial were: T1 = EG (60%) + concentrate (40%); T2 = T1 + dried cassava leaves (equivalent to 3.5% CT); T3 = T2 + PEG. Eighteen growing sheep aged 8-9 months and weighing 23.67 ± 1.23 kg were used in Expt 2. The results of the in vivo study showed that the treatments significantly affected (P<0.05) nutrient intake and digestibility (DM, OM, and CP). N retention for sheep receiving treatment T2 was significantly higher (P<0.05; 15.6 g/d) than for T1 (9.1 g/d) and T3 (8.53 g/d). Similar results were obtained for daily weight gain, where T2 was the highest (62.79 g/d), followed by T3 (52.85 g/d) and T1 (51.9 g/d). The dressing percentage of T2 was the highest (51.54%), followed by T1 (49.61%) and T3 (49.32%). It can be concluded that adding dried cassava leaves did not reduce palatability due to CT, but rather increased OM digestibility, and hence feed consumption improved. N retention increased due to the action of CT in the cassava leaves, which may explain a higher input of N into the duodenum that in turn led to higher daily weight gain and dressing percentage. Keywords: in vitro gas production, sheep, shrub and tree leaves, condensed tannin
Procedia PDF Downloads 264
24504 Calibration of Syringe Pumps Using Interferometry and Optical Methods
Authors: E. Batista, R. Mendes, A. Furtado, M. C. Ferreira, I. Godinho, J. A. Sousa, M. Alvares, R. Martins
Abstract:
Syringe pumps are commonly used for drug delivery in hospitals and clinical environments. These instruments are critical in neonatology and oncology, where any variation in the flow rate and drug dosing quantity can lead to severe incidents and even the death of the patient. Therefore, it is very important to determine the accuracy and precision of these devices using suitable calibration methods. The Volume Laboratory of the Portuguese Institute for Quality (LVC/IPQ) uses two different methods to calibrate syringe pumps from 16 nL/min up to 20 mL/min. The interferometric method uses an interferometer to monitor the distance travelled by the pusher block of the syringe pump in order to determine the flow rate. Knowing the internal diameter of the syringe with very high precision, the travelled distance, and the time needed for that travel, it is possible to calculate the flow rate of the fluid inside the syringe and its uncertainty. As an alternative to the gravimetric and interferometric methods, a methodology based on optical technology was also developed to measure flow rates; it relies mainly on measuring the increase in volume of a drop over time. The objective of this work is to compare the results of the calibration of two syringe pumps using the different methodologies described above. The obtained results were consistent across the three methods used. The uncertainty values were very similar for all three methods, being higher for the optical drop method due to setup limitations. Keywords: calibration, flow, interferometry, syringe pump, uncertainty
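The interferometric computation described above reduces to geometry: with the syringe bore diameter d, pusher travel L, and elapsed time t, the volumetric flow rate is Q = (π d²/4) · L/t. A minimal sketch with illustrative values (the diameter, travel, and time below are assumptions, not measurements from the paper):

```python
import math

# Sketch of the interferometric flow-rate computation:
#   Q = (pi * d**2 / 4) * (L / t)
# where d is the syringe internal diameter, L the pusher travel, t the time.
# All numeric values below are illustrative only.

def flow_rate_ul_per_min(diameter_mm, distance_mm, time_s):
    """Volumetric flow rate in microlitres per minute (1 mm^3 == 1 uL)."""
    area_mm2 = math.pi * diameter_mm ** 2 / 4.0
    return area_mm2 * distance_mm / time_s * 60.0

# Example: a 4.61 mm bore syringe whose pusher advances 0.5 mm in 60 s
q = flow_rate_ul_per_min(4.61, 0.5, 60.0)
```

Since Q depends on d squared, the relative uncertainty in the diameter enters the flow uncertainty twice, which is why the abstract stresses knowing the bore "with very high precision".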
Procedia PDF Downloads 107
24503 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning
Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza
Abstract:
The following work presents a proposal for autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot. We have integrated the Robot Operating System (ROS) with machine learning algorithms. Two ROS distributions are used: ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning, with statistical and probabilistic, global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the planned trajectory and obstruct the path of the Kuka Youbot. Detection is managed by an artificial vision module with a trained neural network based on the Single Shot MultiBox Detector (SSD), where the main dynamic objects targeted for detection are human beings and domestic animals, among other objects. When objects are detected, the system modifies the trajectory or waits on the dynamic obstacle's movement. Finally, the obstacles are excluded from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms. Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library
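The Dijkstra global-planning block can be sketched in isolation. The actual system runs the ROS navigation stack with AMCL localisation; this standalone fragment, on a made-up 4-connected occupancy grid, only illustrates the shortest-path search that block performs.

```python
import heapq

# Minimal sketch of a Dijkstra global planner on a 2-D occupancy grid
# (1 = obstacle, 0 = free).  Illustrative stand-in for the ROS global
# planner node, not the actual navigation-stack implementation.

def dijkstra(grid, start, goal):
    """Return the cheapest 4-connected path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:                      # reconstruct path backwards
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return path[::-1]
        if d > dist.get(node, float("inf")):  # stale heap entry
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = dijkstra(grid, (0, 0), (2, 0))  # routes around the obstacle row
```

When the SSD detector flags a dynamic obstacle, marking its cells as occupied and re-running the search yields the modified trajectory the abstract describes.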
Procedia PDF Downloads 175
24502 4G LTE Dynamic Pricing: The Drivers, Benefits, and Challenges
Authors: Ahmed Rashad Harb Riad Ismail
Abstract:
The purpose of this research is to study the potential of dynamic pricing if deployed by mobile operators and to analyse its effects from both the operator and consumer sides. The study concludes with recommended conditions for successful dynamic pricing deployment, recommended factors identifying the types of markets where dynamic pricing can be effective, and a proposal for a dynamic pricing stakeholder framework. Currently, the mobile telecommunications industry is witnessing a dramatic growth rate in data consumption, fostered mainly by higher-data-speed technologies such as 4G LTE and by smart device penetration rates. However, operators' revenue from data services lags behind and is decoupled from this growth in data consumption. Pricing strategy is a key factor affecting this ecosystem. Since the introduction of 4G LTE technology will multiply the pace of data growth, if pricing strategies remain constant the revenue and usage gap will grow wider, risking the sustainability of the ecosystem. Therefore, this research study focuses on dynamic pricing for 4G LTE data services, researching the drivers, benefits, and challenges of 4G LTE dynamic pricing and the feasibility of its deployment in practice from different perspectives, including those of operators, regulators, consumers, and telecommunications equipment manufacturers. Keywords: LTE, dynamic pricing, EPC, research
Procedia PDF Downloads 330
24501 Prediction of Wind Speed by Artificial Neural Networks for Energy Application
Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui
Abstract:
In this work, the variation of wind speed with altitude is calculated and described by an artificial neural network model. Measured wind speed and direction, temperature, and humidity at 10 m are used as input data, with the wind speed at 50 m above sea level as the target. Predicted wind speeds are compared with values extrapolated to 50 m above sea level. The results show that prediction by the artificial neural network method is very accurate. Keywords: MATLAB, neural network, power law, vertical extrapolation, wind energy, wind speed
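The power-law vertical extrapolation the ANN predictions are compared against can be sketched directly: v2 = v1 · (z2/z1)^α. The shear exponent α below is an assumed value (1/7 is a common neutral-stability default), not one taken from the paper.

```python
# Sketch of the power-law vertical extrapolation used as the baseline for
# comparison with the ANN predictions:
#   v2 = v1 * (z2 / z1) ** alpha
# alpha = 1/7 is an assumed default shear exponent, not the paper's value.

def power_law_extrapolate(v1, z1, z2, alpha=1.0 / 7.0):
    """Extrapolate wind speed v1 measured at height z1 to height z2."""
    return v1 * (z2 / z1) ** alpha

# Wind measured at 10 m, extrapolated to the 50 m target height
v50 = power_law_extrapolate(5.0, 10.0, 50.0)
```

The ANN has an advantage over this one-parameter law precisely because it can also exploit direction, temperature, and humidity, which modulate the true shear profile.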
Procedia PDF Downloads 691