Search results for: data source
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28346

28226 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia

Authors: Eyosiyas Aga

Abstract:

The development of information and communication technology has transformed paper-based mapping and land registration into computerized, networked systems. Computerizing and networking real property information plays a vital role in the good governance and sustainable development of emerging countries by providing cost-effective, easy, and accessible service delivery to the customer. An efficient, transparent, and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery within organizations. In Ethiopia, real property administration is paper based; as a result, it faces problems of poor data management, illegal transactions, corruption, and poor service delivery. To solve these problems and facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one automation (computerization) method for facilitating data sharing and reducing the time and cost of service delivery in a real property administration system. In addition, it supports the integration of data across different information systems and organizations. The system is designed by combining open source software that supports Open Geospatial Consortium (OGC) standards. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards for supporting and improving web-based real property updating. These services allow the integration of data from different sources and can be used to maintain data consistency throughout transactions. PostgreSQL and GeoServer are used to manage the real property data and connect it to the flex viewer and user interface.
The system is designed both for internal updating (by the municipality), which mainly covers the updating of spatial and textual information, and as an external system (for the customer), which focuses on serving and interacting with customers. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective, and secure way. The system combines and customizes open source software to enhance efficiency at low cost. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. Requirements were identified through a questionnaire and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current state of real property administration and the contribution of effective data management to efficient, transparent, and sustainable urban development in Ethiopia.
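
As a minimal sketch of how a client might query parcel data over the OGC Web Feature Service the abstract mentions, the following builds a WFS 2.0 GetFeature request URL. The endpoint and layer name (`cadastre:parcels`) are hypothetical placeholders, not values from the paper:

```python
from urllib.parse import urlencode

# Hypothetical GeoServer WFS endpoint -- illustrative only.
BASE_URL = "http://localhost:8080/geoserver/wfs"

def build_getfeature_url(type_name, max_features=10, output_format="application/json"):
    """Build an OGC WFS 2.0 GetFeature request URL for a property layer."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "count": max_features,
        "outputFormat": output_format,
    }
    return f"{BASE_URL}?{urlencode(params)}"

url = build_getfeature_url("cadastre:parcels", max_features=5)
print(url)
```

Such a URL can be fetched by any HTTP client; GeoServer answers with the requested features, which is what allows the flex viewer and external systems to share the same PostgreSQL-backed data.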

Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service

Procedia PDF Downloads 267
28225 Synthesis and Characterization of Iron Modified Geopolymer and Its Resistance against Chloride and Sulphate

Authors: Noor-ul-Amin, Lubna Nawab, Sabiha Sultana

Abstract:

Geopolymers with different silica-to-alumina ratios and iron content have been synthesized using sodium silicate, aluminum salts, and iron salts as the sources of silica, alumina, and iron, and sodium/potassium hydroxide as the alkaline medium. The iron was taken from iron(III) salts and laterite clay samples; laterite was used as a natural source of iron in the modified geopolymer. The synthesized iron-modified geopolymer was subjected to different aggressive environments, including chloride and sulphate solutions at different concentrations. Experimental techniques including XRF, XRD, and FTIR were used to study the bonding nature and the effect of the aggressive environments on the geopolymer. The major phases formed during geopolymerization are sodalite (Na₄Al₃Si₃O₁₂Cl), albite (NaAlSi₃O₈), hematite (Fe₂O₃), and chabazite, as confirmed by the XRD results. The resulting geopolymer showed greater resistance to sulphate and chloride than the normal geopolymer.

Keywords: modified geopolymer, laterite, chloride, sulphate

Procedia PDF Downloads 156
28224 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures

Authors: Mariem Saied, Jens Gustedt, Gilles Muller

Abstract:

We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures with Ordered Read-Write Locks (ORWL) as an execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we couple the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern, and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
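
To make the class of computations concrete: a Jacobi traversal of the 2D 5-point stencil (not Dido's generated code, just a serial NumPy sketch of the kernel such frameworks parallelize) looks like this:

```python
import numpy as np

def jacobi_step(grid):
    """One Jacobi iteration of the 2D 5-point stencil: each interior
    cell becomes the average of its four neighbors, computed from the
    previous grid (Jacobi, unlike Gauss-Seidel, never reads updated cells)."""
    new = grid.copy()
    new[1:-1, 1:-1] = 0.25 * (grid[:-2, 1:-1] + grid[2:, 1:-1] +
                              grid[1:-1, :-2] + grid[1:-1, 2:])
    return new

# Toy example: fixed hot boundary on the left edge, zero elsewhere.
g = np.zeros((6, 6))
g[:, 0] = 100.0
for _ in range(50):
    g = jacobi_step(g)
    g[:, 0] = 100.0  # re-impose the boundary condition
print(g.round(1))
```

In a distributed setting, each iteration requires exchanging halo cells between neighboring subdomains; that communication is exactly what ORWL synchronizes and what temporal blocking reduces.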

Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments

Procedia PDF Downloads 127
28223 Dynamic Compensation for Environmental Temperature Variation in the Coolant Refrigeration Cycle as a Means of Increasing Machine-Tool Precision

Authors: Robbie C. Murchison, Ibrahim Küçükdemiral, Andrew Cowell

Abstract:

Thermal effects are the largest source of dimensional error in precision machining, and a major proportion is caused by ambient temperature variation. The use of coolant is a primary means of mitigating these effects, but there has been limited work on coolant temperature control. This research critically explored whether CNC-machine coolant refrigeration systems adapted to actively compensate for ambient temperature variation could increase machining accuracy. Accuracy data were collected from operators' checklists for a CNC 5-axis mill and statistically reduced to daily bias and precision metrics over a 27-day sample period. Temperature data were collected using three USB data loggers in the ambient air, the chiller inflow, and the chiller outflow. The accuracy and temperature data were analysed using Pearson correlation, and the thermodynamics of the system were then described using system identification in MATLAB. It was found that 75% of thermal error is reflected in the hot coolant temperature, but that this is negligibly dependent on ambient temperature. The effect of the coolant refrigeration process on hot coolant outflow temperature was also found to be negligible. Therefore, the evidence indicated that it would not be beneficial to adapt coolant chillers to compensate for ambient temperature variation. However, it is concluded that hot coolant outflow temperature is a robust and accessible source of thermal error data which could be used for prevention strategy evaluation or as the basis of other thermal error strategies.
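
The correlation step can be sketched as follows; the two series here are synthetic stand-ins for the logged ambient and hot-coolant temperatures, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the logged series over a 27-day sample period:
# daily ambient temperature and hot coolant outflow temperature.
ambient = 20.0 + 3.0 * rng.standard_normal(27)
coolant = 35.0 + 0.1 * ambient + rng.standard_normal(27)  # weak coupling

# Pearson correlation coefficient between the two series.
r = np.corrcoef(ambient, coolant)[0, 1]
print(f"Pearson r = {r:.3f}")
```

A value of r near zero, as the study found between ambient and hot-coolant temperatures, is what argues against adapting the chiller to ambient conditions.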

Keywords: CNC manufacturing, machine-tool, precision machining, thermal error

Procedia PDF Downloads 89
28222 Space Vector PWM and Model Predictive Control for Voltage Source Inverter Control

Authors: Irtaza M. Syed, Kaamran Raahemifar

Abstract:

In this paper, we present a comparative assessment of Space Vector Pulse Width Modulation (SVPWM) and Model Predictive Control (MPC) for a two-level, three-phase (2L-3P) Voltage Source Inverter (VSI). The VSI with its associated system is subjected to both control techniques, and the results are compared. MATLAB/Simulink was used to model, simulate, and validate the control schemes. The findings of this study show that MPC is superior to SVPWM in terms of total harmonic distortion (THD) and implementation.
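
The THD figure of merit used for the comparison can be computed from a sampled waveform's spectrum; the sketch below (a generic definition, not the paper's Simulink model) measures the RMS of the harmonics relative to the fundamental:

```python
import numpy as np

def thd(signal, fs, f0):
    """Total harmonic distortion: ratio of the RMS of harmonics 2..9
    to the RMS of the fundamental at frequency f0."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def mag_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag_at(f0)
    harmonics = [mag_at(k * f0) for k in range(2, 10)]
    return np.sqrt(sum(h * h for h in harmonics)) / fundamental

fs, f0 = 10_000, 50
t = np.arange(0, 1, 1 / fs)
# 50 Hz fundamental plus a 10% third harmonic -> THD close to 0.10.
v = np.sin(2 * np.pi * f0 * t) + 0.1 * np.sin(2 * np.pi * 3 * f0 * t)
print(f"THD = {thd(v, fs, f0):.3f}")
```

A lower THD of the inverter's output current or voltage indicates a cleaner waveform, which is the metric on which MPC outperformed SVPWM here.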

Keywords: voltage source inverter, space vector pulse width modulation, model predictive control, comparison

Procedia PDF Downloads 508
28221 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work

Authors: K. Myška, L. Pilařová

Abstract:

The article deals with the possibilities of using cheap mobile devices combined with free or open source software tools as an alternative to professional hardware and software equipment. In social work especially, it is important to find a cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research focused on the analysis of cheap and affordable solutions for digitizing the paper materials most frequently used by field workers in social work. We used comparative analysis as the research method. Social workers often need to process data from paper forms; in many cases it is still more affordable, time-efficient, and cost-effective to use paper forms to collect feedback. Collecting data from paper quizzes and questionnaires can be done with the help of professional scanners and software. These technologies are very powerful and have advanced options for digitizing and processing the digitized data, but they are also very expensive. According to the results of our study, the combination of open source software with a mobile phone or cheap scanner can be considered a cost-effective alternative to professional equipment.

Keywords: digitalization, e-learning, mobile devices, questionnaire

Procedia PDF Downloads 151
28220 The BNCT Project Using the Cf-252 Source: Monte Carlo Simulations

Authors: Marta Błażkiewicz-Mazurek, Adam Konefał

Abstract:

The project can be divided into three main parts: (i) modeling the Cf-252 neutron source and conducting an experiment to verify the correctness of the obtained results, (ii) designing the BNCT system infrastructure, and (iii) analyzing the results from the logical detector. Modeling of the Cf-252 source included designing the shape and size of the source as well as the energy and spatial distribution of the emitted neutrons. Two options were considered: a point source and a cylindrical spatial source. The energy distribution corresponded to various spectra taken from the specialized literature. Directionally isotropic neutron emission was simulated. The simulation results were compared with experimental values determined by the activation detector method, using indium foils and cadmium shields. The relative fluence rates of thermal and resonance neutrons were compared at chosen locations in the vicinity of the source. The second part of the project, modeling the BNCT infrastructure, consisted of developing a simulation program taking into account all the essential components of this system. Materials with neutron-moderating, absorbing, and backscattering properties were adopted in the project. Additionally, a gamma radiation filter was introduced into the beam output system. The analysis of the simulation results, obtained using a logical detector located at the beam exit from the BNCT infrastructure, covered the neutron energies and their spatial distribution. Optimization of the system involved changing its size and materials to obtain a suitably collimated beam of thermal neutrons.
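
The "directionally isotropic emission" step of such Monte Carlo source models is standard: sample cos(theta) uniformly and the azimuth uniformly. A minimal sketch (not the authors' code, and with no energy sampling) is:

```python
import numpy as np

def isotropic_directions(n, rng):
    """Sample n unit vectors uniformly over the sphere, as used for
    directionally isotropic neutron emission: cos(theta) uniform in
    [-1, 1], phi uniform in [0, 2*pi)."""
    cos_t = rng.uniform(-1.0, 1.0, n)
    sin_t = np.sqrt(1.0 - cos_t**2)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    return np.column_stack((sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t))

rng = np.random.default_rng(42)
d = isotropic_directions(100_000, rng)
print(d.mean(axis=0))  # close to (0, 0, 0) for an isotropic distribution
```

Sampling cos(theta), rather than theta itself, is what makes the directions uniform over solid angle instead of clustering at the poles.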

Keywords: BNCT, Monte Carlo, neutrons, simulation, modeling

Procedia PDF Downloads 29
28219 Evaluation of Cultural Landscape Perception in Waterfront Historic Districts Based on Multi-source Data - Taking Venice and Suzhou as Examples

Authors: Shuyu Zhang

Abstract:

Waterfront historic districts, a type of historic district bordering water bodies such as the sea, lakes, and rivers, have a distinctive urban form. In past preservation and renewal of traditional historic districts, discussion has largely focused on the land area, while waterfront and marginal spaces are easily overlooked. However, the waterfront space of a historic district, as a cultural landscape heritage combining historic buildings and landscape elements, has strong ecological and sustainability value. At the same time, Suzhou and Venice, as sister water cities in history, offer many waterfront spaces that can be compared in urban form and on other levels. This paper therefore focuses on the waterfront historic districts of Venice and Suzhou, establishes quantitative evaluation indicators for environmental perception, draws analogies between the two, and aims to promote the renewal and activation of the entire historic district by improving the spatial quality and vitality of the waterfront area. The paper uses multi-source data for analysis: the Baidu Maps and Google Maps APIs are used to crawl street views of the waterfront historic districts; machine learning algorithms analyze the proportion of cultural landscape elements, such as the green view rate, in the street view pictures; and space syntax software provides quantitative selectivity analysis. Together these establish environmental perception evaluation indicators for the waterfront historic districts. Finally, by comparing and summarizing the waterfront historic districts of Venice and Suzhou, the paper reveals their similarities, differences, and characteristics, and hopes to provide a reference for the heritage preservation and renewal of other waterfront historic districts.
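
The green view rate mentioned above is, at its core, the fraction of a street view image occupied by vegetation. The paper uses machine learning segmentation; the sketch below is only a crude color-threshold stand-in on a toy image, to make the indicator concrete:

```python
import numpy as np

def green_view_rate(image):
    """Fraction of pixels classified as vegetation in an RGB image.
    Crude stand-in for ML segmentation: a pixel counts as 'green'
    when its G channel strictly dominates both R and B."""
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    green_mask = (g > r) & (g > b)
    return green_mask.mean()

# Toy 2x2 "street view": two green pixels, one gray, one blue.
img = np.array([[[30, 200, 40], [10, 120, 20]],
                [[128, 128, 128], [20, 30, 220]]], dtype=np.uint8)
print(green_view_rate(img))  # -> 0.5
```

Averaging this rate over all street view images sampled along a district's waterfront yields one of the per-district perception indicators being compared.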

Keywords: waterfront historical district, cultural landscape, perception, multi-source data

Procedia PDF Downloads 197
28218 TessPy – Spatial Tessellation Made Easy

Authors: Jonas Hamann, Siavash Saki, Tobias Hagen

Abstract:

Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It helps in understanding urban space and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, like square grids or hexagon grids, are suitable for addressing pure geometry problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, in contrast, allow the borders between subareas to be defined more realistically based on urban features like a road network or Points of Interest (POI). Even though Python is one of the most used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare different techniques. To close this gap, we propose TessPy, an open-source Python package which combines all of the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy implement five different tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. With the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source, provided by OpenStreetMap, and can easily be extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed.
All dependencies can be installed using conda or pip; the former is recommended.
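
To illustrate the Voronoi idea behind one of the irregular methods without assuming TessPy's own API, the sketch below computes a discrete Voronoi tessellation with plain NumPy: every cell of a raster grid is labeled with its nearest generator point. The generator coordinates are hypothetical stand-ins for POI locations:

```python
import numpy as np

# Hypothetical generator points (e.g., POIs) in a 100 x 100 study area.
rng = np.random.default_rng(7)
generators = rng.uniform(0, 100, size=(5, 2))

# Cell centers of the raster grid.
xs, ys = np.meshgrid(np.arange(100), np.arange(100), indexing="ij")
cells = np.stack((xs.ravel(), ys.ravel()), axis=1)  # shape (10000, 2)

# Squared distance from every cell to every generator; nearest one wins.
d2 = ((cells[:, None, :] - generators[None, :, :]) ** 2).sum(axis=2)
labels = d2.argmin(axis=1).reshape(100, 100)

print("tiles used:", np.unique(labels))
```

Each label value corresponds to one Voronoi tile; the tiles partition the area with no gaps or overlaps, which is exactly the tessellation property described above.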

Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies

Procedia PDF Downloads 128
28217 Estimation of the Road Traffic Emissions and Dispersion in the Developing Countries Conditions

Authors: Hicham Gourgue, Ahmed Aharoune, Ahmed Ihlal

Abstract:

We present in this work our model of road traffic emissions (line sources) and of the dispersion of these emissions, named DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission Model). In its emission part, this model was designed to keep the bottom-up and top-down approaches consistent. It also allows emission inventories to be generated from a reduced set of input parameters, adapted to the conditions existing in Morocco and in other developing countries. While several simplifications are made, the full performance of the model is retained. A further important advantage of the model is that it allows uncertainty calculation, including the emission rate uncertainty attributable to each of the input parameters. In the dispersion part of the model, an improved line source model has been developed, implemented, and tested against a reference solution. It improves on previous formulas of the line source Gaussian plume model in accuracy, without being too demanding in terms of computational resources. In the case study presented here, the biggest errors were associated with the ends of line source sections; these errors are canceled by adjacent sections of line sources during the simulation of a road network. In cases where the wind is parallel to the source line, the use of a combination of discretized source and analytical line source formulas reduces the error remarkably. Because this combination is applied only for a small number of wind directions, it should not excessively increase the calculation time.
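
For context, the textbook ground-reflecting Gaussian plume formula for a single point source, which a discretized line source sums over, can be sketched as follows (generic formula, not DISPOLSPEM's improved variant; all input values are illustrative):

```python
import numpy as np

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h):
    """Ground-reflecting Gaussian plume concentration for a point source.
    q: emission rate (g/s), u: wind speed (m/s), h: source height (m);
    sigma_y, sigma_z: dispersion parameters (m) at the receptor distance.
    A line source can be approximated by summing many such point sources."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Receptor on the plume centerline at ground level, 0.5 m source height.
c = gaussian_plume(q=1.0, u=3.0, y=0.0, z=0.0, sigma_y=20.0, sigma_z=10.0, h=0.5)
print(f"{c:.2e} g/m^3")
```

The second exponential in the vertical term is the image source reflecting the plume at the ground; summing many such point sources along a road segment is the discretized-source approach the abstract combines with analytical line source formulas.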

Keywords: air pollution, dispersion, emissions, line sources, road traffic, urban transport

Procedia PDF Downloads 442
28216 Effect of Nitrogen Source on Production of CMCase by Bacillus megaterium 1295S Isolated from Sewage Treatment Plants

Authors: Adel A. S. Al-Gheethi, M. O. Abdul-Monem

Abstract:

Cellulase-producing bacteria were isolated from wastewater and sludge and identified as Bacillus megaterium 1295S, Sporosarcina pasteurii 586S, Bacillus subtilis 117S, Burkholderia cepacia 120S, and Staphylococcus xylosus 222W. Among these bacteria, B. megaterium 1295S was the best cellulase producer under catabolic repression and was therefore selected for studying the factors affecting cellulase production. The optimum conditions for cellulase production were observed in CMC-Yeast Extract (CYE) agar medium (pH 6.5) inoculated with 0.4 mL of bacterial culture and incubated at 45 ˚C for 72 h. Twenty amino acids were introduced into the production medium as nitrogen sources to investigate cellulase production in the presence of amino acids in comparison to peptone (an organic source) and sodium nitrate (an inorganic source). The results showed that the maximum production of cellulase was recorded at 50 ppm when L-hydroxyproline, L-arginine, glycine, L-histidine, L-leucine, DL-isoleucine, or DL-β-phenylalanine was used as the sole nitrogen source, and at 100 ppm when DL-threonine, L-ornithine, or L-proline was used as the sole nitrogen source. The highest biomass yield was found when glycine at 5 ppm or DL-serine at 100 ppm was used as the nitrogen source.

Keywords: CMCase, Bacillus megaterium 1295S, factors, amino acids

Procedia PDF Downloads 448
28215 The Evaluation Model for the Quality of Software Based on Open Source Code

Authors: Li Donghong, Peng Fuyang, Yang Guanghua, Su Xiaoyan

Abstract:

Using open source code is a popular method of software development, so how to evaluate the quality of such software becomes increasingly important. This paper introduces an evaluation model. The model evaluates quality along four dimensions: technology, production, management, and development. Each dimension includes many indicators, and the weight of each indicator can be modified according to the purpose of the evaluation. The paper also introduces a method of using the model. The evaluation result can provide good guidance for evaluating or purchasing software.
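
A minimal sketch of such a weighted-indicator model follows. The dimension and indicator names and all weights are hypothetical, since the paper does not list them here; only the aggregation scheme (weighted sums within and across dimensions) is illustrated:

```python
# Hypothetical indicators per dimension with intra-dimension weights.
DIMENSIONS = {
    "technology": {"code_quality": 0.6, "test_coverage": 0.4},
    "production": {"release_cadence": 1.0},
    "management": {"issue_response": 1.0},
    "development": {"contributor_count": 0.5, "commit_activity": 0.5},
}

def evaluate(scores, dim_weights):
    """Aggregate indicator scores (0-10) into one quality score:
    weighted sum inside each dimension, then a weighted mean across
    dimensions. Weights can be tuned to the purpose of the evaluation."""
    total = 0.0
    for dim, indicators in DIMENSIONS.items():
        dim_score = sum(scores[name] * w for name, w in indicators.items())
        total += dim_weights[dim] * dim_score
    return total / sum(dim_weights.values())

scores = {"code_quality": 8, "test_coverage": 6, "release_cadence": 7,
          "issue_response": 5, "contributor_count": 9, "commit_activity": 7}
weights = {"technology": 0.4, "production": 0.2, "management": 0.2, "development": 0.2}
print(round(evaluate(scores, weights), 2))  # -> 6.88
```

Changing `weights` changes which dimension dominates the score, which is how the model adapts to different evaluation or purchasing purposes.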

Keywords: evaluation model, software quality, open source code, evaluation indicator

Procedia PDF Downloads 389
28214 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure

Authors: T. Nozu, K. Hibi, T. Nishiie

Abstract:

This paper discusses the applicability of a numerical model to a damage prediction method for accidental hydrogen explosions occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model express the propagation of the premixed flame surface and the diffusion combustion process, respectively. To validate this numerical model, we have simulated two previous hydrogen explosion tests. The first is an open-space explosion test in which the source was a prismatic 5.27 m³ volume with a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m away from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m² on one side; the test was performed with ignition at the center of the wall opposite the vent. Hydrogen-air mixtures with hydrogen concentrations close to 18 vol.% were used in the tests. The results of the numerical simulations were compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data are in good agreement with the results of the two explosion tests.

Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure

Procedia PDF Downloads 244
28213 PSRR Enhanced LDO Regulator Using Noise Sensing Circuit

Authors: Min-ju Kwon, Chae-won Kim, Jeong-yun Seo, Hee-guk Chae, Yong-seo Koo

Abstract:

In this paper, we present an LDO (low-dropout) regulator with enhanced PSRR, obtained by applying a constant current source generation technique through a bandgap reference (BGR) to form a noise sensing circuit. The current source through the BGR delivers a constant current even if the applied voltage varies. The noise sensing circuit, which is built around this BGR-based current source, operates between the error amplifier and the pass transistor gate of the LDO regulator. As a result, the LDO regulator achieves a PSRR of -68.2 dB at 1 kHz, -45.85 dB at 1 MHz, and -45 dB at 10 MHz. The other performance metrics of the proposed LDO were maintained at the same level as those of a conventional LDO regulator.

Keywords: LDO regulator, noise sensing circuit, current reference, pass transistor

Procedia PDF Downloads 283
28212 Benefits of Hybrid Mix in Renewable Energy and Integration with E-Efficient Compositions

Authors: Ahmed Khalil

Abstract:

Increased energy demand around the world has led to a rise in power production, which has resulted in more greenhouse gas emissions from fossil sources. These fossil sources and emissions degrade the ecosystem. Renewable energy sources therefore come onto the scene as eco-friendly, clean energy supplies, while the energy needs of electrical devices decrease over time. Each of these renewable energy sources contributes to the reduction of greenhouse gases and mitigates environmental deterioration. However, there are also general and source-specific challenges which influence the choices of investors. The most prominent general challenge, affecting end users' comfort and reliability, is intermittence, which derives from variations in source conditions due to the dynamics of nature and uncontrolled periodic changes. Research and development professionals strive to mitigate the intermittence challenge through material improvement for each renewable source, while a hybrid source mix stands as an alternative solution. This solution works well when the individual renewable technologies are upgraded further. On the other hand, the integration of energy-efficient devices and systems raises the positive effect of such a solution by lowering the energy required in a sustainability composition or scenario. This paper provides a view of the advantages of a composed renewable source mix versus single-source usage, with the contribution of sampled energy-efficient systems and devices. Accordingly, it demonstrates the extended benefits through the planning and predictive estimation stages of the Ahmadi Town Projects in Kuwait.

Keywords: e-efficient systems, hybrid source, intermittence challenge, renewable energy

Procedia PDF Downloads 136
28211 Utilization of Online Risk Mapping Techniques versus Desktop Geospatial Tools in Making Multi-Hazard Risk Maps for Italy

Authors: Seyed Vahid Kamal Alavi

Abstract:

Italy has experienced a notable number of disasters, with significant impact, due to natural hazards and technological accidents caused by diverse risk sources affecting its physical, technological, and human/sociological infrastructures during the past decade. This study discusses the frequency and impacts of the three most devastating natural hazards in Italy for the period 2000-2013. The approach examines the reliability of a range of open source WebGIS techniques against a proposed multi-hazard risk management methodology. Spatial and attribute data, which include publicly available USGS hazard data and thirteen years of Munich RE recorded data for Italy at different severities, have been processed and visualized in a GIS (Geographic Information System) framework. Comparison of the results from the study showed that the multi-hazard risk maps generated using open source techniques do not provide a reliable system for analyzing infrastructure losses with respect to national risk sources, although they can be adopted for general international risk management purposes. Additionally, this study establishes the possibility of critically examining and calibrating different integrated techniques in evaluating what better protection measures can be taken in an area.

Keywords: multi-hazard risk mapping, risk management, GIS, Italy

Procedia PDF Downloads 371
28210 Phishing Attacks Facilitated by Open Source Intelligence

Authors: Urva Maryam

Abstract:

Information has become an important asset in the modern world. Globally, various tactics are being observed to confine the spread of information, as it makes people vulnerable to security attacks. Open Source Intelligence (OSINT) refers to publicly available sources that disseminate information about users, websites, companies, and various organizations. This paper follows a quantitative method of exploring various OSINT tools that reveal the public information of individuals, information which can in turn facilitate phishing attacks. Phishing attacks can be launched through email addresses, open ports, and insecure web surfing. This study analyzes the information retrieved from OSINT tools, i.e., theHarvester and Maltego, that can be used to mount phishing attacks on individuals.

Keywords: e-mail spoofing, Maltego, OSINT, phishing, spear phishing, theHarvester

Procedia PDF Downloads 148
28209 Constructing a Two-Tier Test about Source Current to Diagnose Pre-Service Elementary School Teacher’ Misconceptions

Authors: Abdeljalil Metioui

Abstract:

The purpose of this article is to present the results of a two-stage qualitative study. The first stage involved identifying the alternative conceptions held by 80 elementary pre-service teachers from Quebec, Canada, about the operation of simple electrical circuits. To do this, they completed a two-choice (true or false) questionnaire with justifications. Data analysis identified many conceptual difficulties. For example, the majority believe that, whatever electrical device composes an electrical circuit, the current source (power supply) and the generated electrical power are constant. The second stage was to develop a two-tier multiple-choice questionnaire based on the identified conceptions. It allows teachers to quickly diagnose their students' conceptions and take them into account in their teaching.

Keywords: development, electrical circuits, two-tier diagnostic test, secondary and high school

Procedia PDF Downloads 112
28208 Multivariate Assessment of Mathematics Test Scores of Students in Qatar

Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski

Abstract:

Data on various aspects of education are collected regularly at the institutional and government levels. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of such data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which collects data from several countries when students are approximately 15 years of age and enables comparisons of performance in science, mathematics, and English between countries, as well as rankings of countries based on performance in these standardised tests. Beyond the student and school outcomes based on the tests taken as part of the PISA study, there is a wealth of other data collected, including parental demographics and data related to the teaching strategies used by educators. Overall, an abundance of educational data is available which has the potential to help improve educational attainment and the teaching of content in order to improve learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and is used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.
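
One common way to derive such student profiles is cluster analysis (as the keywords below suggest). The sketch here is a plain NumPy k-means on synthetic stand-in records, not the study's PISA data or its actual method:

```python
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    """Plain Lloyd's k-means: returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid, then recompute means.
        d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = x[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic stand-in for student records: [maths score, weekly study hours].
rng = np.random.default_rng(1)
low = rng.normal([450, 2], [20, 0.5], size=(50, 2))
high = rng.normal([600, 6], [20, 0.5], size=(50, 2))
students = np.vstack((low, high))

labels, centroids = kmeans(students, k=2)
print(np.sort(centroids[:, 0]).round())  # two maths-score cluster centers
```

Each resulting cluster centroid summarizes one student profile; in practice the feature vector would include the many PISA background variables mentioned above, standardized before clustering.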

Keywords: cluster analysis, education, mathematics, profiles

Procedia PDF Downloads 126
28207 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board

Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu

Abstract:

Internet of Things (IoT) applications are widely serviced and spread worldwide, and local wireless data transmission techniques must be developed to keep pace with them. Bluetooth is a wireless data communication technique created by the Bluetooth Special Interest Group (SIG); it uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built using an Arduino board (open source hardware), a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The experiment on controlling the transmission rate proceeded by developing an Android application that receives the measured data, and the experimental results show that controlling this rate is feasible. In future work, the communication algorithm will need improvement, because a few errors occur when data are transmitted or received.

Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission

Procedia PDF Downloads 277
28206 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain

Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz

Abstract:

Rainfall is a crucial data source for very different disciplines such as agriculture, hydrology and climate, so the rain rate over any area should be well known both spatially and temporally. Traditionally, rainfall has been measured for many years with rain gauges at meteorological ground stations. At present, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data against rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products and used the data for rainy days at four stations located within range of the radar from October 2013 to November 2015. Examining the two rainfall products over the Seyhan plain, we observed a low correlation between the rain-gauge data and both raster rainfall datasets.
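The aggregation and comparison steps can be sketched as follows (a minimal illustration with invented numbers, not the study's data):

```python
from collections import defaultdict
from math import sqrt

def daily_totals(hourly):
    """Sum hourly precipitation records (day, hour, mm) into daily totals."""
    days = defaultdict(float)
    for day, _hour, mm in hourly:
        days[day] += mm
    return dict(days)

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

The daily radar or satellite totals at each station would then be correlated against the co-located gauge totals; a correlation near zero, as reported here, indicates poor agreement between the raster products and ground truth.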

Keywords: meteosat, radar, rainfall, rain-gauge, Turkey

Procedia PDF Downloads 328
28205 Remote Radiation Mapping Based on UAV Formation

Authors: Martin Arguelles Perez, Woosoon Yim, Alexander Barzilov

Abstract:

High-fidelity radiation monitoring is an essential component in enhancing the situational awareness capabilities of the Department of Energy’s Office of Environmental Management (DOE-EM) personnel. In this paper, multiple unmanned aerial vehicles (UAVs), each equipped with a cadmium zinc telluride (CZT) gamma-ray sensor, are used for radiation source localization, which can provide vital real-time data for EM tasks. To achieve this goal, a fully autonomous swarm of multicopter UAVs in a 3D tetrahedron formation is used to survey the area of interest and perform radiation source localization. The CZT sensor used in this study suits small multicopter UAVs owing to its compact size and ease of interfacing with the UAV’s onboard electronics for high-resolution gamma spectroscopy, enabling the characterization of radiation hazards. The multicopter platform, with its fully autonomous flight capability, is suitable for low-altitude applications such as radiation contamination sites. The conventional approach uses a single UAV mapping a predefined waypoint path to predict the relative location and strength of the source, which can be time-consuming for radiation localization tasks. The proposed UAV swarm-based approach can significantly improve the ability to search for and track radiation sources. In this paper, two approaches are developed using (a) a 2D planar circular formation (3 UAVs) and (b) a 3D tetrahedron formation (4 UAVs). In both approaches, accurate estimation of the gradient vector is crucial for the heading angle calculation. Each UAV carries a CZT sensor, and the real-time radiation data are used to calculate a bulk heading vector that gives the swarm its source-seeking behavior. A spinning formation is also studied in both cases to improve gradient estimation near a radiation source.
In the 3D tetrahedron formation, the UAV located closest to the source is designated as the lead unit to maintain the tetrahedron formation in space. Such a formation demonstrated collective and coordinated movement for estimating the gradient vector of the radiation source and determining an optimal heading direction for the swarm. The proposed radiation localization technique is studied by computer simulation and validated experimentally in an indoor flight testbed using gamma sources. The technology presented in this paper provides the capability to readily add or replace radiation sensors on the UAV platforms in field conditions, enabling extensive condition measurement and greatly improving situational awareness and event management. Furthermore, the proposed approach allows long-term measurements to be performed efficiently over wide areas of interest to prevent disasters and reduce dose risks to people and infrastructure.
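The gradient-estimation step can be illustrated for the 2D planar (3-UAV) case: with three simultaneous readings, the two gradient components follow from the linear system formed by the position offsets (a sketch assuming a locally linear radiation field; function names and values are ours, not the authors'):

```python
def gradient_from_formation(positions, readings):
    """Estimate the 2D field gradient from three UAV sensor readings.

    Solves c_i - c_0 = g . (p_i - p_0) for the two unknown gradient
    components using the two independent position offsets.
    """
    (x0, y0), (x1, y1), (x2, y2) = positions
    c0, c1, c2 = readings
    a11, a12, b1 = x1 - x0, y1 - y0, c1 - c0
    a21, a22, b2 = x2 - x0, y2 - y0, c2 - c0
    det = a11 * a22 - a12 * a21  # zero if the three UAVs are collinear
    gx = (b1 * a22 - b2 * a12) / det
    gy = (a11 * b2 - a21 * b1) / det
    return gx, gy
```

The swarm's heading would then be taken along the estimated gradient; the 3D tetrahedron case adds a fourth UAV and a third unknown component, and the spinning formation mentioned above helps keep the offsets well-conditioned (non-collinear) near the source.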

Keywords: radiation, unmanned aerial system (UAV), source localization, UAV swarm, tetrahedron formation

Procedia PDF Downloads 99
28204 Preparation of Li Ion Conductive Ceramics via Liquid Process

Authors: M. Kotobuki, M. Koishi

Abstract:

Li1.5Al0.5Ti1.5(PO4)3 (LATP) has received much attention as a solid electrolyte for lithium batteries. In this study, the LATP solid electrolyte is prepared by the co-precipitation method using Li3PO4 as the Li source. The LATP is successfully prepared, and the bulk (inner crystal) and total (inner crystal and grain boundary) Li ion conductivities are 1.1 × 10⁻³ and 1.1 × 10⁻⁴ S cm⁻¹, respectively. These values are comparable to the reported values obtained when Li2C2O4 is used as the Li source. It is concluded that the LATP solid electrolyte can be prepared by the co-precipitation method using Li3PO4 as the Li source, and that this procedure has an advantage in mass production over the previous Li2C2O4-based procedure because Li3PO4 is a lower-priced reagent than Li2C2O4.

Keywords: co-precipitation method, lithium battery, NASICON-type electrolyte, solid electrolyte

Procedia PDF Downloads 352
28203 Numerical Modeling the Cavitating Flow in Injection Nozzle Holes

Authors: Ridha Zgolli, Hatem Kanfoudi

Abstract:

Cavitating flows inside a diesel injection nozzle hole were simulated using a mixture model, and a 2D numerical model is proposed in this paper to simulate steady cavitating flows. The Reynolds-averaged Navier-Stokes equations are solved for the liquid-vapor mixture, which is treated as a single fluid with a variable density expressed as a function of the vapor volume fraction. Closure for this variable is provided by a transport equation with a source term; the processes of evaporation and condensation are governed by changes in pressure within the flow. The source term is implemented in the CFD code ANSYS CFX. The influence of the numerical and physical parameters is presented in detail, and the numerical simulations are in good agreement with the experimental data for steady flow.
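The single-fluid closure described above ties the mixture density to the vapor volume fraction via a volume-weighted average; a minimal sketch (the default densities are illustrative diesel-like values, not values from the paper):

```python
def mixture_density(alpha_v, rho_vapor=0.02, rho_liquid=830.0):
    """Single-fluid mixture density from the vapor volume fraction.

    rho_m = alpha_v * rho_v + (1 - alpha_v) * rho_l
    Densities in kg/m^3; alpha_v = 0 is pure liquid, alpha_v = 1 pure vapor.
    """
    if not 0.0 <= alpha_v <= 1.0:
        raise ValueError("vapor volume fraction must lie in [0, 1]")
    return alpha_v * rho_vapor + (1.0 - alpha_v) * rho_liquid
```

In the full model, alpha_v itself evolves through the transport equation, whose pressure-driven source term switches between evaporation (pressure below vapor pressure) and condensation (pressure above it).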

Keywords: cavitation, injection nozzle, numerical simulation, k–ω

Procedia PDF Downloads 401
28202 Translation Directionality: An Eye Tracking Study

Authors: Elahe Kamari

Abstract:

Research on the translation process has been conducted for more than 20 years, investigating various issues and using different research methodologies. Most recently, researchers have started to use eye tracking to study translation processes, on the assumption that the observable, measurable data gained from eye tracking are indicators of the unobservable cognitive processes happening in the translator’s mind during translation tasks. The aim of this study was to investigate directionality in translation processes using eye tracking. The following hypotheses were tested: 1) processing the target text requires more cognitive effort than processing the source text, in both directions of translation; 2) L2 translation tasks on the whole require more cognitive effort than L1 tasks; 3) the cognitive resources allocated to processing the source text are higher in L1 translation than in L2 translation; 4) the cognitive resources allocated to processing the target text are higher in L2 translation than in L1 translation; and 5) in both directions, non-professional translators invest more cognitive effort in translation tasks than professional translators do. The performance of a group of 30 male professional translators was compared with that of a group of 30 male non-professional translators. All the participants translated two comparable texts, one into their L1 (Persian) and the other into their L2 (English). The eye tracker measured gaze time, average fixation duration, total task length, and pupil dilation; these variables are assumed to measure the cognitive effort allocated to the translation task. The data derived from eye tracking confirmed only the first hypothesis, which was supported by all the relevant indicators: gaze time, average fixation duration, and pupil dilation. The second hypothesis, that L2 translation tasks require the allocation of more cognitive resources than L1 translation tasks, was not confirmed by all four indicators.
The third hypothesis, that source text processing requires more cognitive resources in L1 translation than in L2 translation, and the fourth hypothesis, that target text processing requires more cognitive effort in L2 translation than in L1 translation, were not confirmed; it seems that source text processing in L2 translation can be just as demanding as in L1 translation. The final hypothesis, that non-professional translators allocate more cognitive resources to the same translation tasks than professionals do, was only partially confirmed: one of the indicators, average fixation duration, in fact showed higher cognitive-effort-related values for professionals.

Keywords: translation processes, eye tracking, cognitive resources, directionality

Procedia PDF Downloads 463
28201 Phishing Attacks Facilitated by Open Source Intelligence

Authors: Urva Maryam

Abstract:

Information has become an important asset in the current world. Globally, various tactics are observed to confine the spread of information, as it makes people vulnerable to security attacks. Open Source Intelligence (OSINT) refers to publicly available sources that disseminate information about users, websites, companies, and various organizations. This paper uses a quantitative method to explore various OSINT tools that reveal public information about individuals; such information can further facilitate phishing attacks, which can be launched via email addresses, open ports, and unsecured web surfing. The study analyzes information retrieved from OSINT tools, i.e., theHarvester and Maltego, that can be used to target phishing attacks at individuals.
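The kind of harvesting such tools perform can be illustrated with a minimal sketch that extracts e-mail addresses from already-collected public pages (a simplified, hypothetical stand-in for what theHarvester automates, not its actual implementation):

```python
import re

# Deliberately simple pattern; production tools handle many more edge cases
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def harvest_emails(pages):
    """Collect unique e-mail addresses appearing in already-fetched page text."""
    found = set()
    for text in pages:
        found.update(EMAIL_RE.findall(text))
    return sorted(found)
```

Addresses gathered this way are exactly the kind of public data point an attacker could feed into a targeted (spear) phishing campaign, which is why the paper treats OSINT exposure as a phishing enabler.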

Keywords: OSINT, phishing, spear phishing, email spoofing, theHarvester, Maltego

Procedia PDF Downloads 81
28200 Uncloaking Priceless Pieces of Evidence: Psychotherapy with an Older New Zealand Man; Contributions to Understanding Hidden Historical Phenomena and the Trans-Generation Transmission of Silent and Un-Witnessed Trauma

Authors: Joanne M. Emmens

Abstract:

This paper makes use of the case notes from a single psychoanalytically informed psychotherapy, conducted over a four-year period with a now 72-year-old man, to explore the potential of qualitative data to be incorporated into a research methodology that can contribute theory and knowledge to the wider professional community involved in mental health care. The clinical material arising out of any psychoanalysis provides a potentially rich source of clinical data that could contribute valuably to our historical understanding of both individual and societal traumata. As psychoanalysis is primarily an investigation, it is argued that clinical case material is a rich source of qualitative data with relevance for sociological and historical understandings, and that it can illuminate important ‘gaps’ and collective blind spots that manifest unconsciously and contribute to the silent transmission of trauma across generations. By attending to this case material, the hope is to illustrate the value of a psychoanalytically centred methodology. It is argued that the study of individual defences, and the manner in which they come into consciousness, allows insight into group defences and the unconscious forces that contribute to the silencing or un-noticing of important sources (or originators) of mental suffering.

Keywords: dream furniture (Bion) and psychotic functioning, reverie, screen memories, selected fact

Procedia PDF Downloads 199
28199 Design of an Air and Land Multi-Element Expression Pattern of Navigation Electronic Map for Ground Vehicles under United Navigation Mechanism

Authors: Rui Liu, Pengyu Cui, Nan Jiang

Abstract:

At present, there is much research on the centralized management and cross-integration of basic geographic information. However, the idea of information integration and sharing between land, sea, and air navigation targets has not been deeply applied in research on navigation information services, especially in information expression. Targeting this problem, the paper develops an expression pattern of the navigation electronic map for ground vehicles under an air and land united navigation mechanism. First, supported by multi-source information fusion of GIS vector data, RS data, GPS data, etc., an air and land united information expression pattern is designed for the specific navigation task of emergency rescue after an earthquake. Then, the characteristics and specifications of the united expression of air and land navigation information under map-load constraints are summarized and transferred into expression rules in a rule bank. Finally, a navigation experiment is implemented to evaluate the expression pattern. The experiment selects the navigation task completion time and the navigation error rate as the main evaluation indices and makes comparisons with the traditional single-information expression pattern. To sum up, the research improves the theory of the navigation electronic map and lays a foundation for the design and realization of a united navigation system in the aspect of real-time navigation information delivery.
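A rule bank constrained by map load, as described above, might be organized as follows (the elements, thresholds, and rule strings are hypothetical illustrations, not taken from the paper):

```python
# Each entry: (load ceiling, map element, expression rule).
# Lower map load permits richer symbology; ordered from richest to sparsest.
RULE_BANK = [
    (0.5, "air_corridor", "full symbol with altitude label"),
    (0.8, "air_corridor", "simplified symbol, no label"),
    (1.0, "air_corridor", "hidden"),
]

def select_rule(element, map_load):
    """Pick the first rule whose load ceiling admits the current map load."""
    for max_load, elem, rule in RULE_BANK:
        if elem == element and map_load <= max_load:
            return rule
    return "hidden"
```

The point of the sketch is only the mechanism: as the map load rises, the rule bank degrades the expression of air-navigation elements gracefully instead of cluttering the ground-vehicle display.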

Keywords: navigation electronic map, united navigation, multi-element expression pattern, multi-source information fusion

Procedia PDF Downloads 199
28198 Marzuq Basin Palaeozoic Petroleum System

Authors: M. Dieb, T. Hodairi

Abstract:

In Southwest Libya, the Palaeozoic deposits form an important petroleum system, with the Silurian shale considered a hydrocarbon source rock and the Cambro-Ordovician recognized as a good reservoir. The Palaeozoic petroleum system has the greatest potential for conventional resources and is thought to represent the most significant prospect for unconventional petroleum resources in Southwest Libya. Until now, the lateral and vertical heterogeneity of the source rock has not been well evaluated, and the oil-source correlation is still a matter of debate. One source rock, considered the main potential source in the Marzuq Basin, was investigated for its uranium content using gamma-ray logs, Rock-Eval pyrolysis, and organic petrography for its bulk kinetic characteristics, to determine the petroleum potential qualitatively and quantitatively. Thirty source rock samples and fifteen oil samples from the Tannezzuft source rock were analyzed by Rock-Eval pyrolysis, microscopic investigation, GC, and GC-MS to detect acyclic isoprenoids and aliphatic, aromatic, and NSO biomarkers. Geochemical tools were applied to screen source- and age-significant biomarkers to highlight genetic relationships. A great heterogeneity exists among source rock zones from different depths, with varying uranium contents according to the gamma-ray logs, Rock-Eval pyrolysis results, and kinetic features. The uranium-rich Tannezzuft Formation (Hot Shale) generates oil and oil-to-gas hydrocarbons according to its richness, kerogen type, and thermal maturity. Biomarker results such as C₂₇, C₂₈, and C₂₉ sterane concentrations and C₂₄ tetracyclic terpane/C₂₉ tricyclic terpane ratios, together with sterane and hopane ratios, provide the most promising information for differentiating within the Silurian Tannezzuft Shale and for correlating it with its expelled oils.
The Tannezzuft Hot Shale is considered the main source rock for the oil and gas accumulations in the Cambro-Ordovician reservoirs of the Marzuq Basin. Migration of the generated and expelled oil and gas from the Tannezzuft source rock to the Cambro-Ordovician reservoirs is interpreted to have occurred along vertical and lateral pathways along faults in the Palaeozoic strata. The Upper Tannezzuft Formation (cold shale) is considered the primary seal in the Marzuq Basin.

Keywords: heterogeneity, hot shale, kerogen, Silurian, uranium

Procedia PDF Downloads 63
28197 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment

Authors: P. Venu, Joeju M. Issac

Abstract:

Quality Function Deployment (QFD) is a systematic technique that creates a platform where customer responses can be converted into design attributes. The accuracy of a QFD process depends heavily on the data it handles, which are captured from customers or QFD team members. Customized computer programs that perform Quality Function Deployment within a stipulated time have been used by various companies across the globe. These programs rely heavily on the storage and retrieval of data in a common database, and this database must act as a reliable source with minimal missing or erroneous values in order to perform an accurate prioritization. This paper introduces a missing/error data handler module that uses a genetic algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between the proposed data-handler-module-based deployment and manual deployment.
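A minimal sketch of the missing-data handler idea, assuming a toy genetic algorithm that fills gaps in a customer-rating matrix (the 1-9 score range, fitness function, and all names are illustrative assumptions; the paper's actual GA and fuzzy-number scheme are not reproduced here):

```python
import random

def impute_missing(ratings, generations=60, pop_size=20, seed=1):
    """Toy GA: fill None entries with integer scores (1-9) that minimize
    deviation from each column's mean of the observed ratings."""
    holes = [(r, c) for r, row in enumerate(ratings)
             for c, v in enumerate(row) if v is None]
    if not holes:
        return [row[:] for row in ratings]
    rng = random.Random(seed)
    col_means = []
    for c in range(len(ratings[0])):
        vals = [row[c] for row in ratings if row[c] is not None]
        col_means.append(sum(vals) / len(vals))

    def fitness(candidate):  # lower is better
        return sum(abs(v - col_means[c]) for (_r, c), v in zip(holes, candidate))

    pop = [[rng.randint(1, 9) for _ in holes] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]      # elitist selection
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(len(child))] = rng.randint(1, 9)  # mutation
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    filled = [row[:] for row in ratings]
    for (r, c), v in zip(holes, best):
        filled[r][c] = v
    return filled
```

Once the matrix is complete, the usual QFD prioritization (weighting customer requirements against design attributes) can proceed without bias from empty cells; the paper's module additionally represents the imputed judgments as fuzzy numbers, which this sketch omits.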

Keywords: hybrid data handler, QFD, prioritization, module-based deployment

Procedia PDF Downloads 297