Search results for: large scale maps
12158 Evaluate the Possibility of Using ArcGIS Basemaps as GCP for Large Scale Maps
Authors: Jali Octariady, Ida Herliningsih, Ade K. Mulyana, Annisa Fitria, Diaz C. K. Yuwana
Abstract:
Awareness of the importance of large-scale maps for the development of a country is growing in all walks of life, especially among governments in Indonesia. Various parties, especially local governments throughout Indonesia, have demanded the immediate availability of large-scale 1:5000 maps for regional development. In fact, however, large-scale 1:5000 maps are available for less than 5% of the territory of Indonesia. The unavailability of precise GCPs across the entire territory of Indonesia is one cause of the slow provision of 1:5000 large-scale maps. This research was conducted to find an alternative solution to this problem: to assess the accuracy of ArcGIS basemap coordinates when used as GCPs for creating maps at a scale of 1:5000. The study compared GCP coordinates from field surveys using geodetic GPS with coordinates from ArcGIS basemaps at various locations in Indonesia. The study areas were Lombok Island, Kupang City, Surabaya City and Kediri District. The differences between the coordinate values served as the basis for assessing the accuracy of the ArcGIS basemap coordinates. The results across the study areas show considerable variation, with coordinate differences ranging from millimeters (mm) to meters (m). This demonstrates the inconsistent quality of ArcGIS basemap coordinates and indicates that they cannot be relied upon as GCPs for large-scale mapping across the entire territory of Indonesia.
Keywords: accuracy, ArcGIS base maps, GCP, large scale maps
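The accuracy assessment described here reduces to differencing surveyed and basemap coordinates and summarizing the result. A minimal sketch, assuming projected easting/northing coordinates in meters; point names and values are invented for illustration:

```python
import math

# Illustrative (easting, northing) pairs in meters; values are made up.
surveyed = {"GCP01": (410250.312, 9106321.855), "GCP02": (412877.904, 9110502.110)}
basemap  = {"GCP01": (410250.298, 9106321.902), "GCP02": (412878.640, 9110501.350)}

def horizontal_error(p, q):
    """Planar distance between a surveyed point and its basemap counterpart."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

errors = {k: horizontal_error(surveyed[k], basemap[k]) for k in surveyed}
rmse = math.sqrt(sum(e * e for e in errors.values()) / len(errors))
for k, e in errors.items():
    print(f"{k}: {e:.3f} m")
print(f"Horizontal RMSE: {rmse:.3f} m")
```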
Procedia PDF Downloads 373
12157 Challenges and Opportunities: One Stop Processing for the Automation of Indonesian Large-Scale Topographic Base Map Using Airborne LiDAR Data
Authors: Elyta Widyaningrum
Abstract:
LiDAR data acquisition has been recognized as one of the fastest solutions for providing the base data for topographic base mapping in Indonesia. The challenge of accelerating the provision of large-scale topographic base maps as a basis for development planning creates an opportunity to implement an automated scheme in the map production process. One-stop processing will also help accelerate map provision, in particular conformance with the Indonesian fundamental spatial data catalog derived from ISO 19110 and geospatial database integration. Thus, automated LiDAR classification, DTM generation and feature extraction will be conducted in a single GIS-software environment to form all layers of the topographic base maps. The quality of the automated topographic base map will be assessed and analyzed based on its completeness, correctness, contiguity, consistency and possible customization.
Keywords: automation, GIS environment, LiDAR processing, map quality
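The first two automated stages named above, ground classification and DTM generation, reduce to selecting ground returns and gridding their elevations. A minimal sketch with synthetic points, assuming ASPRS classification codes (class 2 = ground); the grid size, arrays and lowest-return rule are illustrative, not the paper's workflow:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x, y = rng.uniform(0, 100, n), rng.uniform(0, 100, n)   # coordinates, m
z = 50 + 0.1 * x + rng.normal(0, 0.2, n)                # elevations, m
cls = rng.choice([2, 5], n, p=[0.6, 0.4])               # 2 = ground, 5 = vegetation

gx, gy, res = 100, 100, 1.0                             # 1 m DTM grid
dtm = np.full((gy, gx), np.nan)
ground = cls == 2
cols = np.minimum((x[ground] / res).astype(int), gx - 1)
rows = np.minimum((y[ground] / res).astype(int), gy - 1)
for r, c, elev in zip(rows, cols, z[ground]):
    # keep the lowest ground return per cell as the terrain estimate
    if np.isnan(dtm[r, c]) or elev < dtm[r, c]:
        dtm[r, c] = elev
print(f"DTM cells filled: {np.count_nonzero(~np.isnan(dtm))} of {dtm.size}")
```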
Procedia PDF Downloads 36812156 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformations such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of data, the geometry of spatial features, such as length, sinuosity, orientation, perimeter and area, is altered. This is most pronounced in the preparation of small-scale maps, since the cartographer does not have enough space to represent all features on the map. In practice, when GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes GIS users zoom beyond the scale of the source map and violate the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10000, 1:50000 and 1:250000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was performed by repeating the analysis with different combinations of data. Road, river and land use data sets were used for the study. A simple model, finding the best place for a wildlife park, was used to identify the effects. The results show remarkable effects of the different degrees of generalization: different locations with different geometries were obtained as the outputs of the analysis. The study suggests that there should be sound methods to overcome this effect; as a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.
Keywords: generalization, GIS, scales, spatial analysis
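One generalization operator named above, line simplification, is easy to demonstrate: simplifying a sinuous line shortens it, and that geometric change propagates into any later overlay analysis. A minimal sketch using shapely's Douglas-Peucker simplification; the coordinates and tolerances are illustrative:

```python
import numpy as np
from shapely.geometry import LineString

# A sinuous "river" as a dense polyline
t = np.linspace(0, 10, 500)
river = LineString(zip(t, np.sin(3 * t) * 0.5))

for tol in (0.05, 0.2, 0.5):   # larger tolerance ~ smaller target scale
    simplified = river.simplify(tol)   # Douglas-Peucker simplification
    change = 100 * (river.length - simplified.length) / river.length
    print(f"tolerance {tol}: length {simplified.length:.2f} "
          f"({change:.1f}% shorter, {len(simplified.coords)} vertices)")
```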
Procedia PDF Downloads 328
12155 Prediction of Fire Growth of the Office by Real-Scale Fire Experiment
Authors: Kweon Oh-Sang, Kim Heung-Youl
Abstract:
Estimating the engineering properties of fires is important in preparing for the complex and varied fire risks of large-scale structures such as super-tall buildings, large stadiums, and multi-purpose structures. In this study, a compartment mock-up measuring 2.4 (L) x 3.6 (W) x 2.4 (H) m was fabricated at the 10 MW LSC (Large Scale Calorimeter), and combustible office supplies were placed in the compartment for a real-scale fire test. The maximum heat release rate was 4.1 MW, and the total energy release obtained through the application of the t² fire growth rate was 6705.9 MJ.
Keywords: fire growth, fire experiment, t2 curve, large scale calorimeter
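The t² curve referenced here is the standard design fire model in which the heat release rate grows as Q(t) = αt² until capped at the peak. A minimal sketch taking the 4.1 MW peak from the abstract but assuming a 'fast' growth coefficient (α = 0.0469 kW/s²) and a one-hour window purely for illustration:

```python
import numpy as np

# t-squared fire growth: Q(t) = alpha * t**2 (kW), capped at a peak HRR.
# alpha = 0.0469 kW/s^2 is the standard "fast" growth coefficient; the
# 4.1 MW peak comes from the abstract, the growth class is an assumption.
alpha, q_peak = 0.0469, 4100.0           # kW/s^2, kW
t = np.arange(0.0, 3600.0, 1.0)          # one hour at 1 s resolution
q = np.minimum(alpha * t**2, q_peak)     # heat release rate curve
energy_mj = np.trapz(q, t) / 1000.0      # cumulative energy release, MJ
print(f"Time to peak: {np.sqrt(q_peak / alpha):.0f} s")
print(f"Total energy over 1 h: {energy_mj:.1f} MJ")
```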
Procedia PDF Downloads 338
12154 A Study of Closed Sets and Maps with Ideals
Authors: Asha Gupta, Ramandeep Kaur
Abstract:
The purpose of this paper is to study a class of closed sets, called generalized pre-closed sets with respect to an ideal (briefly, Igp-closed sets), which extends generalized pre-closed sets in general topology. Using these sets, the concepts of Igp-compact spaces, along with some classes of maps such as continuous and closed maps via ideals, are introduced, and analogues of some known results for compact spaces, continuous maps and closed maps in general topology are obtained.
Keywords: ideal, gp-closed sets, gp-closed maps, gp-continuous maps
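For orientation, one common formulation of this notion in the ideal-topology literature is sketched below, where pcl denotes the pre-closure operator; the paper's precise definition may differ:

```latex
% One common formulation (an assumption; the paper may state it differently).
% Let (X, \tau) be a topological space and I an ideal on X.
A \subseteq X \ \text{is Igp-closed} \iff
  \operatorname{pcl}(A) \setminus U \in I
  \ \text{whenever } A \subseteq U \text{ and } U \in \tau .
% Taking I = \{\emptyset\} recovers the usual gp-closed sets, for which
% \operatorname{pcl}(A) \subseteq U under the same hypothesis.
```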
Procedia PDF Downloads 224
12153 RGB-D SLAM Algorithm Based on Pixel-Level Dense Depth Map
Authors: Hao Zhang, Hongyang Yu
Abstract:
Scale uncertainty is a well-known challenging problem in visual SLAM. Because an RGB-D sensor provides depth information, RGB-D SLAM alleviates this scale uncertainty problem. However, due to physical hardware limitations, the depth map output by an RGB-D sensor usually contains large areas of missing depth values. This missing depth information affects the accuracy and robustness of RGB-D SLAM. To reduce these effects, this paper completes the missing areas of the depth map output by the RGB-D sensor and then fuses the completed dense depth map into ORB SLAM2. By adding this process of obtaining pixel-level dense depth maps, a better RGB-D visual SLAM algorithm is obtained. A deep learning model of indoor scenes is adopted to obtain the dense depth maps. Experiments are conducted on public datasets and in real-world indoor environments. Experimental results show that the proposed SLAM algorithm is more robust than ORB SLAM2.
Keywords: RGB-D, SLAM, dense depth, depth map
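The completion-and-fusion step reduces to keeping valid sensor measurements and filling holes from the learned prediction. A minimal sketch of one simple rule, with the network output first scale-aligned to the sensor by median ratio; the paper's exact fusion strategy is an assumption here:

```python
import numpy as np

def align_scale(sensor: np.ndarray, predicted: np.ndarray) -> np.ndarray:
    """Scale the network prediction so its median matches valid sensor depth."""
    valid = np.isfinite(sensor) & (sensor > 0)
    return predicted * np.median(sensor[valid] / predicted[valid])

def complete_depth(sensor: np.ndarray, predicted: np.ndarray) -> np.ndarray:
    """Keep valid sensor depth; fill holes (zeros/NaNs) from the prediction."""
    predicted = align_scale(sensor, predicted)
    holes = ~np.isfinite(sensor) | (sensor <= 0)
    return np.where(holes, predicted, sensor)

# Toy example: a 4x4 depth map with two holes
sensor = np.array([[1.0, 1.1, 0.0, 1.2],
                   [1.0, np.nan, 1.1, 1.2],
                   [0.9, 1.0, 1.1, 1.1],
                   [0.9, 1.0, 1.0, 1.1]])
pred = np.full_like(sensor, 1.05)
print(complete_depth(sensor, pred))
```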
Procedia PDF Downloads 140
12152 Identification of Potential Large Scale Floating Solar Sites in Peninsular Malaysia
Authors: Nur Iffika Ruslan, Ahmad Rosly Abbas, Munirah Stapah@Salleh, Nurfaziera Rahim
Abstract:
Increased concern about and awareness of the environmental hazards of burning fossil fuels for energy have become the major factors driving the transition toward green energy. An additional 2,000 MW of renewable energy is expected from renewable sources by 2025 following the implementation of Large Scale Solar projects in Peninsular Malaysia, including Large Scale Floating Solar projects. Floating solar has advantages over its land-based counterparts; for example, the requirement for land acquisition is relatively insignificant. As part of the site selection process established by TNB Research Sdn. Bhd., a set of mandatory and rejection criteria was developed to identify only sites that are feasible for the future development of a Large Scale Floating Solar power plant. A total of 85 lakes and reservoirs were identified within Peninsular Malaysia; only those with a minimum surface area of 120 acres were considered as potential sites for the development of a Large Scale Floating Solar power plant. The result is a total of 10 potential Large Scale Floating Solar sites, located in Selangor, Johor, Perak, Pulau Pinang, Perlis and Pahang. This paper elaborates on the various mandatory and rejection criteria, as well as the site selection process required to identify potential (suitable) Large Scale Floating Solar sites in Peninsular Malaysia.
Keywords: Large Scale Floating Solar, Peninsular Malaysia, Potential Sites, Renewable Energy
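The mandatory screening step is essentially a filter over the inventory of water bodies, as sketched below with the 120-acre threshold from the abstract; the records themselves are invented for illustration:

```python
# Keep only water bodies that meet the 120-acre minimum surface area.
MIN_AREA_ACRES = 120.0

lakes = [
    {"name": "Lake A", "state": "Selangor", "area_acres": 310.0},
    {"name": "Lake B", "state": "Perak", "area_acres": 95.0},
]

candidates = [lk for lk in lakes if lk["area_acres"] >= MIN_AREA_ACRES]
for lk in candidates:
    print(f'{lk["name"]} ({lk["state"]}): {lk["area_acres"]} acres')
```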
Procedia PDF Downloads 181
12151 A Clustering Algorithm for Massive Texts
Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen
Abstract:
Internet users face massive amounts of textual data every day. Organizing texts into categories can help users extract useful information from large-scale text collections. Clustering is one of the most promising tools for categorizing texts due to its unsupervised nature. Unfortunately, most traditional clustering algorithms lose their effectiveness on large-scale text collections, mainly because of the high-dimensional vectors generated from texts. To cluster large-scale text collections effectively and efficiently, this paper proposes a vector-reconstruction-based clustering algorithm in which only the features that represent a cluster are preserved in the cluster's representative vector. The algorithm alternates between two sub-processes until it converges. One is the partial tuning sub-process, where feature weights are fine-tuned iteratively; to accelerate clustering, an intersection-based similarity measurement and a corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, where features are reallocated among different clusters and features useless for representing a cluster are removed from its representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that the algorithm achieves high quality on both small-scale and large-scale text collections.
Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process
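The intersection idea can be sketched as a similarity computed only over features that a document shares with a cluster's pruned representative vector. The exact formula below (a generalized Jaccard score) and the feature weights are illustrative; the paper's measurement and neuron adjustment function are not reproduced:

```python
def intersection_similarity(doc: dict, rep: dict) -> float:
    """Score a document against a cluster representative using only the
    features present in both sparse vectors (feature -> weight dicts)."""
    shared = doc.keys() & rep.keys()
    overlap = sum(min(doc[f], rep[f]) for f in shared)
    total = sum(doc.values()) + sum(rep.values()) - overlap
    return overlap / total if total else 0.0

doc = {"map": 0.6, "scale": 0.3, "survey": 0.1}
rep = {"map": 0.5, "scale": 0.4, "gis": 0.1}
print(f"{intersection_similarity(doc, rep):.3f}")   # 0.667
```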
Procedia PDF Downloads 435
12150 Dynamic Ad-hoc Topologies for Mobile Robot Navigation Based on Non-Uniform Grid Maps
Authors: Peter Sauer, Thomas Hinze, Petra Hofstedt
Abstract:
Avoiding obstacles in the surrounding environment and navigating to a given target are among the most important tasks for mobile robots, and different data structures suit these tasks. For avoiding nearby obstacles, occupancy grid maps are an ideal representation of the surroundings. For less fine-grained tasks, such as navigating from one room to another in an apartment, pure grid maps are inappropriate: grid maps are very detailed, and calculating paths between rooms based on them would take too long. Instead, graph-based data structures, so-called topologies, turn out to be the proper choice for such tasks. In this paper we present two methods to dynamically create topologies from grid maps. Both methods are based on non-uniform grid maps. The topologies are generated on the fly and can easily be modified to represent changes in the environment. This allows a hybrid approach to controlling mobile robots in which, depending on the situation and the current task, either the grid map or the generated topology may be used.
Keywords: robot navigation, occupancy grids, topological maps, dynamic map creation
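The core step in both methods, deriving a sparse graph from a detailed grid map, can be sketched with a uniform coarsening pass (an assumption for brevity; the paper works on non-uniform grids, which this sketch does not reproduce). Nodes are mostly-free coarse cells and edges connect 4-neighbors:

```python
import numpy as np
import networkx as nx

def grid_to_topology(occ: np.ndarray, cell: int = 4) -> nx.Graph:
    """Collapse a fine occupancy grid (1 = blocked) into a coarse graph:
    one node per coarse cell that is mostly free, edges between 4-neighbors."""
    h, w = occ.shape
    coarse = set()
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            if occ[i:i + cell, j:j + cell].mean() < 0.2:   # mostly free
                coarse.add((i // cell, j // cell))
    g = nx.Graph()
    for (r, c) in coarse:
        g.add_node((r, c))
        for dr, dc in ((1, 0), (0, 1)):
            if (r + dr, c + dc) in coarse:
                g.add_edge((r, c), (r + dr, c + dc))
    return g

occ = np.zeros((20, 20)); occ[8:12, :14] = 1   # a wall with a gap
topo = grid_to_topology(occ)
print(topo.number_of_nodes(), topo.number_of_edges())
```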
Procedia PDF Downloads 563
12149 The Map of Cassini: An Accurate View of the Current Border Between Spain and France
Authors: Barbara Polo Martin
Abstract:
During the 18th century, the border between Spain and France underwent various changes, primarily due to territorial agreements, wars, and treaties between the two nations and other European powers. For studying these changes, the Cassini maps remain valuable historical documents, offering a glimpse into the landscape and geography of 18th-century France and its neighboring regions, including the border between Spain and France. However, it is essential to recognize that these maps may not reflect modern political boundaries or territorial changes that have occurred since their creation. The project was initiated by King Louis XV in 1744 and continued by his successor, Louis XVI. The primary objective was to produce accurate maps of France to serve various purposes, including military, administrative, and scientific ones. The Cassini maps were groundbreaking for their time, as they were among the earliest attempts to create topographic maps on a national scale. They covered the entirety of France and were based on meticulous surveying and cartographic techniques. The maps featured precise geographic details, including elevation contours, rivers, roads, forests, and settlements. This study analyzes this rich and little-known cartography of France, examines the rich place names it offers, and assesses the accuracy of the delimitations created over time between the two empires, both historically and through a Geographic Information System. The study offers deeper knowledge of the cartography that marks the beginning of topography in Europe.
Keywords: cartography, engineering, borders, Spain, France, Cassini
Procedia PDF Downloads 60
12148 The Challenges of Scaling Agile to Large-Scale Distributed Development: An Overview of the Agile Factory Model
Authors: Bernard Doherty, Andrew Jelfs, Aveek Dasgupta, Patrick Holden
Abstract:
Many companies have moved to agile and hybrid agile methodologies in which portions of the Software Design Life-cycle (SDLC) and Software Test Life-cycle (STLC) can be time-boxed to enhance delivery speed and quality and to increase flexibility to changes in software requirements. Despite the widespread proliferation of agile practices, implementation often fails due to a lack of adequate project management support, decreased motivation or fear of increased interaction. Consequently, few organizations adopt agile processes effectively, and tailoring is often required to integrate agile methodology into large-scale environments. This paper provides an overview of the challenges in implementing an innovative, large-scale, tailored realization of the agile methodology termed the Agile Factory Model (AFM), with the aim of comparing and contrasting issues of specific importance to organizations undertaking large-scale agile development. The conclusions demonstrate that agile practices can be effectively translated to a globally distributed development environment.
Keywords: agile, agile factory model, globally distributed development, large-scale agile
Procedia PDF Downloads 294
12147 A World Map of Seabed Sediment Based on 50 Years of Knowledge
Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès
Abstract:
Production of a global sedimentological seabed map was initiated in 1995 to provide the necessary tool for searches for aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This approach had already been initiated a century earlier, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments along the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a single sediment classification to present all types of sediments: from beaches to the deep seabed and from glacial deposits to tropical sediments. To allow good visualization and to suit the different applications, only the granularity of sediments is represented. Published seabed maps are reviewed; if they are of interest, the nature of the seabed is extracted from them, the sediment classification is transcribed, and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large hydrographic surveys of the deep ocean, which allow very high-quality mapping of areas that until then were represented as homogeneous. The third and principal source of data is the integration of regional maps produced specifically for this project. These regional maps are compiled using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalization applied in the case of over-precise data. Eighty-six regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing, with a digital version issued every two years incorporating new maps. This article describes the choices made in terms of sediment classification, the scale of the source data, and the zonation of quality variability. The map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million point and area data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000 and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress made in seabed characterization during recent decades. Thus, the arrival of new seafloor classification systems has improved the recent seabed maps, and the compilation of these new maps with those previously published allows gradual enrichment of the world sedimentary map. Much work remains, however, to improve some regions that are still based on data acquired more than half a century ago.
Keywords: marine sedimentology, seabed map, sediment classification, world ocean
Procedia PDF Downloads 232
12146 Study of Large-Scale Atmospheric Convection over the Tropical Indian Ocean and Its Association with Oceanic Variables
Authors: Supriya Manikrao Ovhal
Abstract:
In India, summer monsoon rainfall occurs owing to large-scale convection associated with the continental ITCZ. It has been found that convection over the tropical ocean increases with SST from 26 to 28 °C, and that when SST is above 29 °C it decreases sharply for warm pool areas of the Indian Ocean and for monsoon areas of the West Pacific Ocean. The reduction in convection can be influenced by large-scale subsidence forced by nearby or remotely generated deep convection; it has thus been observed that under strong large-scale rising motion, convection does not decrease but increases monotonically with SST, even when SST exceeds 29.5 °C. Convection is related to the SST gradient, which helps generate low-level moisture convergence and upward vertical motion in the atmosphere. Strong wind fields, such as the cross-equatorial low-level jet stream on the equatorward side of the warm pool, are produced by convection initiated by the SST gradient. Areas with maximum SST have a low SST gradient, resulting in feeble convection. It is therefore worth noting that the oceanic role (other than SST) could be prominent in influencing large-scale atmospheric convection. Since the warm ocean surface allows heat radiation to penetrate into the ocean subsurface, and since no studies have examined the role of the ocean subsurface in large-scale atmospheric convection, the present study concentrates on the subsurface contribution to large-scale atmospheric convection by considering the SST gradient, mixed layer depth (MLD), thermocline and barrier layer. The study examines the probable role of subsurface ocean parameters in influencing convection.
Procedia PDF Downloads 93
12145 Thermal Network Model for a Large Scale AC Induction Motor
Authors: Sushil Kumar, M. Dakshina Murty
Abstract:
Thermal network modelling has proven to be an important tool for the thermal analysis of electrical machines. This article investigates a numerical thermal network model and the experimental performance of a large-scale AC motor. Experimental temperatures were measured using RTDs in the stator and compared with the numerical data. The thermal network model predicts the temperatures of the various components inside the large-scale AC motor fairly well. Predicted stator winding temperatures are in close agreement with the experimental results, with an accuracy of 6-10%. This method of predicting hot spots within AC motors can readily be used by motor designers to estimate the thermal hot spots of a machine.
Keywords: AC motor, thermal network, heat transfer, modelling
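A thermal network treats the machine as lumped nodes joined by thermal resistances, so steady-state temperatures come from solving a linear system. A deliberately tiny three-node sketch (winding, core, frame); the resistances and losses are invented for illustration, and a real motor model has many more nodes:

```python
import numpy as np

# Nodes: 0 = stator winding, 1 = stator core, 2 = frame (to ambient).
R_wc, R_cf, R_fa = 0.004, 0.002, 0.005      # thermal resistances, K/W
P = np.array([8000.0, 3000.0, 0.0])         # heat input per node, W
T_amb = 40.0                                # ambient temperature, degC

g1, g2, g3 = 1 / R_wc, 1 / R_cf, 1 / R_fa   # conductances, W/K
# Nodal heat balance: G @ (T - T_amb) = P
G = np.array([[ g1,     -g1,      0.0],
              [-g1,  g1 + g2,     -g2],
              [0.0,     -g2,  g2 + g3]])
T = np.linalg.solve(G, P) + T_amb
print(dict(zip(("winding", "core", "frame"), np.round(T, 1))))
# -> winding 149.0, core 117.0, frame 95.0 degC for these toy numbers
```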
Procedia PDF Downloads 326
12144 Deproteinization of Moroccan Sardine (Sardina pilchardus) Scales: A Pilot-Scale Study
Authors: F. Bellali, M. Kharroubi, Y. Rady, N. Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of a large amount of by-products, including skins, bones, heads, guts, and scales. These underutilized resources, particularly scales, contain a large amount of protein and calcium. Sardina pilchardus scales resulting from processing operations have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, upgrading sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Fish scales used as a source of collagen have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from the scales of sardine, Sardina pilchardus. Experimental design methodology was adopted to optimize the collagen extraction process. The first stage of this work investigates the optimal conditions for sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA, and the last establishes the optimum conditions for isolating collagen from fish scales by solvent extraction. The advancement from laboratory scale to pilot scale is a critical stage in technological development. In this study, the optimal deproteinization conditions validated at laboratory scale were employed in the pilot-scale procedure. Deproteinization of fish scales was then demonstrated at pilot scale (2 kg scales, 20 L NaOH), resulting in a protein content of 0.2 mg/ml and a hydroxyproline content of 2.11 mg/l. These results indicate that the pilot scale shows performance similar to that of the laboratory scale.
Keywords: deproteinization, pilot scale, scale, sardine pilchardus
Procedia PDF Downloads 446
12143 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures
Authors: Silvina Caíno-Lores, Jesús Carretero
Abstract:
Large-scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large-scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed by data transfers and the overheads that arise from them. Several techniques attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large-scale high-performance and high-throughput systems. The survey is intended to assist the scientific computing community in understanding the various technical aspects and strategies reported in recent literature regarding data locality. We present an overview of locality-oriented techniques grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, we include a discussion of future research lines and synergies among these techniques.
Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing
Procedia PDF Downloads 259
12142 Brokerage and Value-Creation: Trading Practices in the English Market of 20th-Century Maps
Authors: Shaun Lim
Abstract:
This paper presents a 9-month ethnographic case study of the value-creating strategies employed by an Oxford market-trader of 20th-century maps. Maps are usually valued and sold either as antique objets d'art or as useful navigational tools, with 20th-century maps lying precariously on the boundary between the aesthetic and utilitarian value-regimes. Here, the brokerage practices involved in reframing outdated, lowly valued maps as vintage commodities are examined. Ethnographic material from the unstudied market in old maps is introduced and situated within the second-hand, antique and collectible spheres of exchange. The map-trader as broker is the ethnographic and methodological starting point of this paper. Brokerage is understood through the activity of framing, which defines and brackets the value-regimes of commodities with the aid of market and framing devices. The trader's activities are examined in three parts. (1) The post-sourcing industry: the altering, mounting and tagging of maps before putting them into market circulation. Mounts, frames and tags are seen as market devices that authenticate maps and frame them with aesthetic and symbolic values, while disentangling them from their use value. (2) The market display: the constitution of a space that encourages relations of looking at maps as aesthetic objects, while the categorical arrangement of the display contributes to legitimizing the collectability of maps. (3) The salesmanship strategies of the trader: the matching of customers with maps of meaningful value, and the mediation of knowledge through verbal articulation of a map's symbolic values. Ultimately, value is not created in an accumulative sense but is layered and superimposed to cater to a wide spectrum of patrons. The trader creates demand for his goods by mediating and articulating value-regimes already coherent to potential patrons.
Keywords: art and material culture, brokerage, commodification, framing, markets, value
Procedia PDF Downloads 146
12141 System Survivability in Networks in the Context of Defense/Attack Strategies: The Large Scale
Authors: Asma Ben Yaghlane, Mohamed Naceur Azaiez, Mehdi Mrad
Abstract:
We investigate large-scale networks in the context of network survivability under attack, using appropriate techniques to evaluate both attacker-based and defender-based network survivability. The attacker is unaware of the links operated by the defender. Each attacked link has a pre-specified probability of being disconnected. The defender chooses so as to maximize the chance of successfully sending the flow to the destination node, while the attacker selects the cut-set with the highest chance of being disabled in order to partition the network. Moreover, we extend the problem to the case where the defender selects the best p paths to operate and the attacker the best k cut-sets to target, for arbitrary integers p, k > 1. We investigate some variations of the problem and suggest polynomial-time solutions.
Keywords: defense/attack strategies, large scale, networks, partitioning a network
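A toy sketch of the two survivability quantities discussed above (illustrative only, not the paper's algorithm): a path survives if every one of its links survives, while a cut-set partitions the network only if every one of its links is disabled:

```python
from math import prod

# q[i] is the probability that link i is disconnected when attacked.
def path_survival(q_links):
    """A path survives only if all of its links survive."""
    return prod(1.0 - q for q in q_links)

def cut_disable(q_links):
    """A cut-set partitions the network only if all its links are disabled."""
    return prod(q for q in q_links)

print(path_survival([0.1, 0.2, 0.05]))   # 0.684
print(cut_disable([0.6, 0.7]))           # 0.42
```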
Procedia PDF Downloads 283
12140 A Modified Nonlinear Conjugate Gradient Algorithm for Large Scale Unconstrained Optimization Problems
Authors: Tsegay Giday Woldu, Haibin Zhang, Xin Zhang, Yemane Hailu Fissuh
Abstract:
It is well known that the nonlinear conjugate gradient method is one of the most widely used first-order methods for solving large-scale unconstrained smooth optimization problems. Because of their low memory requirements, attractive theoretical features, practical computational efficiency and nice convergence properties, nonlinear conjugate gradient methods play a special role in solving large-scale unconstrained optimization problems, which have important applications in the practical and scientific world. However, nonlinear conjugate gradient methods carry limited information about the curvature of the objective function and tend to be less efficient and robust than some second-order algorithms. To overcome these drawbacks, a new modified nonlinear conjugate gradient method is presented. The noticeable features of our work are that the new search direction possesses the sufficient descent property independently of any line search and belongs to a trust region. Under mild assumptions and the standard Wolfe line search technique, the global convergence of the proposed algorithm is established. Furthermore, to test its practical computational performance, numerical experiments are provided on a set of large-dimensional unconstrained problems. The numerical results show that the proposed algorithm is efficient and robust compared with similar algorithms.
Keywords: conjugate gradient method, global convergence, large scale optimization, sufficient descent property
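For orientation, a textbook nonlinear CG baseline is sketched below; it uses the Polak-Ribiere+ direction and a simple Armijo backtracking line search, and does not reproduce the paper's modified direction or its Wolfe line search:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Generic Polak-Ribiere+ nonlinear conjugate gradient (a baseline,
    NOT the paper's method) with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:          # safeguard: restart with steepest descent
            d = -g
        t, c, rho = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)   # PR+ coefficient
        x, g, d = x_new, g_new, -g_new + beta * d
    return x

# Example: minimize the 2-D Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))   # approaches (1, 1)
```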
Procedia PDF Downloads 205
12139 Buzan Mind Mapping: An Efficient Technique for Note-Taking
Authors: T. K. Tee, M. N. A. Azman, S. Mohamed, M. Muhammad, M. M. Mohamad, J. Md Yunos, M. H. Yee, W. Othman
Abstract:
Buzan mind mapping is an efficient system of note-taking that makes revision fun for students. Tony Buzan has been teaching children all over the world for the past thirty years and has shown that mind maps can be a magic formula in the classroom for everyone. The purpose of this paper is to discuss the importance of Buzan mind mapping as a note-taking technique for secondary school students. The paper also examines the mind mapping technique and the advantages and disadvantages of hand-drawn mind maps. Samples of students' mind maps are presented and discussed.
Keywords: Buzan mind mapping, note-taking technique, hand-drawn, mind maps
Procedia PDF Downloads 538
12138 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps
Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam
Abstract:
GIS (Geographic Information System) applications require geo-referenced data, which may be available as databases or as digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values; converting such maps into a database is not very difficult. However, text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, since it is almost impossible to exactly reconstruct what was underneath the text or icons; this points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of the proposed technique using non-textual simulated data and compared text-removal results with a popular image-editing tool on public domain data, with promising results.
Keywords: noise, image, GIS, digital map, inpainting
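A minimal sketch of the spatial-locality idea: each pixel flagged as overlaid text is replaced by the most frequent color among its unmasked neighbors. This modal rule is a simple stand-in; the paper's probabilistic model is not reproduced here:

```python
import numpy as np
from collections import Counter

def inpaint_by_locality(img, mask, radius=3):
    """Replace masked pixels (mask True) with the modal color of their
    unmasked neighborhood in an H x W x 3 image."""
    out = img.copy()
    h, w = mask.shape
    for i, j in zip(*np.nonzero(mask)):
        i0, i1 = max(0, i - radius), min(h, i + radius + 1)
        j0, j1 = max(0, j - radius), min(w, j + radius + 1)
        block = img[i0:i1, j0:j1]
        free = ~mask[i0:i1, j0:j1]
        colors = [tuple(c) for c in block[free]]   # known neighbors only
        if colors:
            out[i, j] = Counter(colors).most_common(1)[0][0]
    return out

img = np.zeros((6, 6, 3), np.uint8); img[:, 3:] = (0, 128, 0)
mask = np.zeros((6, 6), bool); mask[2, 1] = True   # a "text" pixel
print(inpaint_by_locality(img, mask)[2, 1])        # -> [0 0 0], local majority
```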
Procedia PDF Downloads 352
12137 Evaluation of Hydrocarbon Prospects of 'ADE' Field, Niger Delta
Authors: Oluseun A. Sanuade, Sanlinn I. Kaka, Adesoji O. Akanji, Olukole A. Akinbiyi
Abstract:
Prospect evaluation of the 'ADE' field was carried out using 3D seismic data and well log data. The field is located in the offshore Niger Delta, where water depth ranges from 450 to 800 m. The objectives of this study are to explore deeper prospects and to ascertain the kinds of traps favorable for the accumulation of hydrocarbons in the field. Six horizons with major and minor faults were identified and mapped in the field. Time structure maps of these horizons were generated, and using the available check-shot data, the maps were converted to top structure maps, which were used to calculate the hydrocarbon volume. The results show that regional structural highs trending northeast-southwest (NE-SW) characterize a large portion of the field. These highs were observed across all horizons, revealing a regional post-depositional deformation. Three prospects were identified and evaluated to understand the different opportunities in the field, including stratigraphic pinch-out and bi-directional downlap. The results of this study show that the field has potential for new opportunities that could be explored in further studies.
Keywords: hydrocarbon, play, prospect, stratigraphy
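The time-to-depth step described here, converting a time structure map with check-shot data, is essentially an interpolation of depth against two-way time. A minimal sketch with invented check-shot pairs and a toy horizon grid:

```python
import numpy as np

# Invented check-shot pairs: two-way time (ms) against measured depth (m).
checkshot_twt_ms = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0])
checkshot_depth_m = np.array([0.0, 450.0, 1050.0, 1800.0, 2700.0])

def twt_to_depth(twt_ms):
    """Interpolate depth at any two-way time along the check-shot curve."""
    return np.interp(twt_ms, checkshot_twt_ms, checkshot_depth_m)

horizon_twt = np.array([[820.0, 845.0], [830.0, 860.0]])  # time map grid, ms
depth_map = twt_to_depth(horizon_twt)                     # top structure, m
print(depth_map)
```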
Procedia PDF Downloads 269
12136 Flood Disaster Prevention and Mitigation in Nigeria Using Geographic Information System
Authors: Dinebari Akpee, Friday Aabe Gaage, Florence Fred Nwaigwu
Abstract:
Natural disasters like floods affect many parts of the world, including developing countries like Nigeria. As a result, many human lives are lost, properties are damaged, and much money is lost through infrastructure damage. These hazards and losses can be mitigated and reduced by providing reliable spatial information about flood risks to the general public through flood inundation maps. Flood inundation maps are crucial for emergency action plans, urban planning, ecological studies and insurance rates. Nigeria experienced the worst flood in its entire history this year: many cities were submerged and completely under water due to torrential rainfall. Poor city planning and a lack of effective development control, among other factors, contribute to the problem. Geographic information systems (GIS) can be used to visualize the extent of flooding, analyze flood maps to produce flood damage estimation maps, and produce flood risk maps. In this research, the following steps were taken to prepare flood risk maps for the study area: (1) digitization of topographic data and preparation of a digital elevation model using ArcGIS; (2) flood simulation using a hydraulic model; and (3) integration of the first two steps to produce flood risk maps. The results show that GIS can play a crucial role in flood disaster control and mitigation.
Keywords: flood disaster, risk maps, geographic information system, hazards
Procedia PDF Downloads 227
12135 Performance Analysis of Routing Protocols for WLAN Based Wireless Sensor Networks (WSNs)
Authors: Noman Shabbir, Roheel Nawaz, Muhammad N. Iqbal, Junaid Zafar
Abstract:
This paper focuses on the performance evaluation of routing protocols in WLAN-based Wireless Sensor Networks (WSNs). A comparative analysis of routing protocols such as Ad-hoc On-demand Distance Vector (AODV), Dynamic Source Routing (DSR) and Optimized Link State Routing (OLSR) is made against different network parameters, such as network load, end-to-end delay and throughput, in small, medium and large-scale sensor network scenarios to identify the best performing protocol. Simulation results indicate that OLSR gives the minimum network load in all three scenarios, while AODV gives the best throughput in the small-scale network; in medium and large-scale networks, DSR is better. In terms of delay, OLSR is more efficient in small and medium-scale networks, while AODV is slightly better in large networks.
Keywords: WLAN, WSN, AODV, DSR, OLSR
Procedia PDF Downloads 448
12134 The Importance of Localization in Large Construction Projects
Authors: Ali Mohammadi
Abstract:
The basis for the construction of any project is a map on which the surveyor can determine the coordinates of points on the ground using a total station. For projects such as dams, roads, tunnels and pipelines, base points whose coordinates were determined using GPS can create challenges for the surveyor during setting-out and control. We first examine some of the map projections on which maps are designed, summarizing their differences and the challenges surveyors face in controlling them: to build projects we need true lengths and angles, so we must use coordinates whose calculations yield those real values. We then examine some examples to clarify the concept of localization, so that the surveyor knows whether a challenge exists and, if so, how to solve the problem.
Keywords: UTM, scale factor, cartesian, traverse
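The localization issue comes down to the fact that grid coordinates (e.g., UTM) do not give ground distances directly: a grid distance must be divided by the combined scale factor (grid scale factor times elevation factor) to recover the ground distance. A minimal sketch with illustrative numbers; note that the grid scale factor equals k0 = 0.9996 only at the UTM central meridian and varies with position:

```python
import math

R = 6371000.0            # mean Earth radius, m
k0 = 0.9996              # UTM central-meridian scale factor
h = 850.0                # mean project elevation, m (illustrative)

def elevation_factor(h, R=R):
    """Reduce sea-level (grid) lengths to the project elevation."""
    return R / (R + h)

def ground_from_grid(grid_dist, k_grid=k0, h=h):
    combined = k_grid * elevation_factor(h)   # combined scale factor
    return grid_dist / combined

print(f"{ground_from_grid(1000.0):.3f} m")    # ~1000.53 m on the ground
```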
Procedia PDF Downloads 80
12133 Optimization Model for Identification of Assembly Alternatives of Large-Scale, Make-to-Order Products
Authors: Henrik Prinzhorn, Peter Nyhuis, Johannes Wagner, Peter Burggräf, Torben Schmitz, Christina Reuter
Abstract:
Assembling large-scale products, such as airplanes, locomotives, or wind turbines, involves frequent process interruptions induced by, e.g., delayed material deliveries or unavailability of resources. This has a negative impact on the logistical performance of producers of XXL products. In industrial practice, when interruptions occur, the identification, evaluation and eventual selection of an alternative order of assembly activities (an 'assembly alternative') pose an enormous challenge, especially if an optimized logistical decision is to be reached. Therefore, this paper presents an innovative optimization model for the identification of assembly alternatives that addresses this problem. It describes make-to-order, large-scale product assembly processes as a resource-constrained project scheduling (RCPS) problem that follows the restrictions encountered in practice. For the evaluation of assembly alternatives, a cost-based definition of the logistical objectives (delivery reliability, inventory, makespan and workload) is presented.
Keywords: assembly scheduling, large-scale products, make-to-order, optimization, rescheduling
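A toy sketch of the cost-based evaluation named above: one assembly alternative is scored by weighting delay, inventory, makespan and overload. The weights, units and scoring rule are assumptions for illustration only:

```python
# Score one assembly alternative by its weighted logistical costs.
def logistical_cost(finish_times, due_date, daily_inventory, capacity_used,
                    w_delay=5000.0, w_inv=1.0, w_makespan=300.0, w_load=10.0):
    makespan = max(finish_times.values())                 # days
    delay = max(0, makespan - due_date)                   # delivery reliability
    inventory = sum(daily_inventory)                      # tied-up capital proxy
    overload = sum(max(0.0, u - 1.0) for u in capacity_used)
    return (w_delay * delay + w_inv * inventory
            + w_makespan * makespan + w_load * overload)

alt = {"fit wings": 12, "install engines": 18, "cabling": 20}
print(logistical_cost(alt, due_date=19, daily_inventory=[40_000] * 20,
                      capacity_used=[0.8, 1.1, 0.95]))
```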
Procedia PDF Downloads 459
12132 Application of Neutron-Gamma Technologies for Soil Elemental Content Determination and Mapping
Authors: G. Yakubova, A. Kavetskiy, S. A. Prior, H. A. Torbert
Abstract:
In-situ soil carbon determination over large soil surface areas (several hectares) is required for carbon sequestration and carbon credit purposes. This capability is important for optimizing modern agricultural practices and enhancing soil science knowledge. Collecting and processing representative field soil cores for traditional laboratory chemical analysis is labor-intensive and time-consuming. The neutron-stimulated gamma analysis method can instead be used for in-situ measurement of the primary elements in agricultural soils (e.g., Si, Al, O, C, Fe, and H). This non-destructive method can assess several elements in large soil volumes with no need for sample preparation. Neutron-gamma soil elemental analysis utilizes the gamma rays issued from different neutron-nuclei interactions. This has become possible due to the availability of commercial portable pulsed neutron generators, high-efficiency gamma detectors, reliable electronics, and measurement/data-processing software, complemented by advances in state-of-the-art nuclear physics methods. In Pulsed Fast Thermal Neutron Analysis (PFTNA), the soil is irradiated with a pulsed neutron flux, and gamma spectra are acquired both during and between pulses. This allows the inelastic neutron scattering (INS) gamma spectrum to be separated from the thermal neutron capture (TNC) spectrum. Based on PFTNA, a mobile system for field-scale soil elemental determination (primarily carbon) was developed and constructed. Our scanning methodology acquires data that can be used directly for creating soil elemental distribution maps (based on ArcGIS software) in a reasonable timeframe (~20-30 hectares per working day). The resulting maps are suitable both for agricultural purposes and for carbon sequestration estimates. The measurement system design, the spectra acquisition process, the strategy for acquiring field-scale carbon content data, and the mapping of agricultural fields are discussed.
Keywords: neutron gamma analysis, soil elemental content, carbon sequestration, carbon credit, soil gamma spectroscopy, portable neutron generators, ArcMap mapping
Procedia PDF Downloads 90
12131 The Use of the Matlab Software as the Best Way to Recognize Penumbra Region in Radiotherapy
Authors: Alireza Shayegan, Morteza Amirabadi
Abstract:
The γ tool was developed to quantitatively compare dose distributions, either measured or calculated. Before computing γ, the dose and distance scales of the two distributions, referred to as evaluated and reference, are re-normalized by dose and distance criteria, respectively. The re-normalization allows the dose distribution comparison to be conducted simultaneously along the dose and distance axes. Several two-dimensional images were acquired using a Scanning Liquid Ionization Chamber EPID and Extended Dose Range (EDR2) films for regular and irregular radiation fields. The raw images were then converted into two-dimensional dose maps. Translational and rotational manipulations were performed on the images using Matlab software. The manipulated images, as evaluated dose distribution maps, were then compared with the corresponding original dose maps as the reference dose maps.
Keywords: energetic electron, gamma function, penumbra, Matlab software
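For orientation, the standard gamma comparison combines a dose difference and a distance-to-agreement into one metric; a reference point passes when the minimum combined value over the evaluated distribution is at most 1. A minimal 1-D sketch in Python (the study itself used Matlab), assuming 3%/3 mm criteria and toy profiles:

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, x, dose_crit=0.03, dist_crit=3.0):
    """1-D gamma: for each reference point, minimize the combined
    dose-distance metric over all evaluated points."""
    d_max = dose_crit * dose_ref.max()          # global dose criterion
    gam = np.empty_like(dose_ref)
    for i, (xr, dr) in enumerate(zip(x, dose_ref)):
        term = ((x - xr) / dist_crit) ** 2 + ((dose_eval - dr) / d_max) ** 2
        gam[i] = np.sqrt(term.min())
    return gam                                  # gamma <= 1 means "pass"

x = np.linspace(0.0, 50.0, 101)                 # position, mm
ref = np.exp(-((x - 25.0) / 8.0) ** 2)          # toy reference profile
ev = np.exp(-((x - 25.5) / 8.0) ** 2)           # slightly shifted evaluated
print(f"Pass rate: {100 * (gamma_index(ref, ev, x) <= 1.0).mean():.1f}%")
```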
Procedia PDF Downloads 301
12130 An Evaluation Model for Automatic Map Generalization
Authors: Quynhan Tran, Hong Fan, Quockhanh Pham
Abstract:
Automatic map generalization is a well-known problem in cartography, and generalization research has developed alongside cartography itself. Traditional maps are plotted manually by cartographic experts. This paper studies the non-scale automatic generalization of residential polygons and house marker symbols and proposes a methodology for evaluating the resulting maps based on the minimal spanning tree: the minimal spanning tree before and after generalization is compared to evaluate whether the generalization result maintains the geographical distribution of features. The minimal spanning tree in vector format is first converted into raster format with a grid size of 2 mm (distance on the map). The number of matching grid cells before and after generalization and the ratio of overlapping cells to total cells are then calculated. Evaluation experiments were conducted to verify the results. The experiments show that this methodology can give an objective evaluation of the feature distribution and assist specialists in visually evaluating the resulting maps of non-scale automatic generalization.
Keywords: automatic cartography generalization, evaluation model, geographic feature distribution, minimal spanning tree
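The evaluation metric reduces to comparing two rasterized minimal spanning trees on a common grid. A minimal sketch of one reading of the overlap ratio (overlap over union of occupied cells); the arrays are illustrative:

```python
import numpy as np

def overlap_ratio(mst_before: np.ndarray, mst_after: np.ndarray) -> float:
    """Boolean rasters of the two minimal spanning trees on a common grid:
    ratio of overlapping cells to all occupied cells."""
    overlap = np.logical_and(mst_before, mst_after).sum()
    total = np.logical_or(mst_before, mst_after).sum()
    return overlap / total if total else 1.0

before = np.zeros((10, 10), bool); before[2, 2:8] = True
after = np.zeros((10, 10), bool);  after[2, 3:8] = True
print(f"Overlap ratio: {overlap_ratio(before, after):.2f}")   # 0.83
```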
Procedia PDF Downloads 636
12129 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model
Authors: Can Huang, Xiaoliang Wang, Qingquan Liu
Abstract:
Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, which involve large soil deformation, strong interface coupling and three-dimensional effects. A meshless particle method, smoothed particle hydrodynamics (SPH), has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration is implemented in the code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of the model. Then the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle distance of 5.0 m provides a converged landslide deposit and surge wave for this example. The numerical results are in good agreement with the limited field survey data. This application to the Huangtian LGIW provides a typical reference for large-scale LGIW assessments, yielding reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH
Procedia PDF Downloads 64