Search results for: large scale mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12156

11976 A Novel Computer-Generated Hologram (CGH) Generation Scheme from a Point Cloud Obtained Using a Lens Array

Authors: Wei-Na Li, Mei-Lan Piao, Nam Kim

Abstract:

We propose a novel computer-generated hologram (CGH) generation scheme, wherein the CGH is generated from a point cloud obtained through a mapping relationship from a series of elemental images captured from a real three-dimensional (3D) object using a lens array. The scheme comprises three procedures: mapping from elemental images to a point cloud, hologram generation, and hologram display. A mapping method is devised to obtain virtual volume data (a point cloud) from the series of elemental images. This mapping method consists of two steps. First, the (x, y) coordinate pairs and their numbers of appearances are calculated from the series of sub-images, which are generated from the elemental images. Second, a series of corresponding (x, y, z) coordinates is calculated from the elemental images. A hologram is then generated from the volume data calculated in the previous two steps. Finally, a spatial light modulator (SLM) and a green laser beam are used to display this hologram and reconstruct the original 3D object. In order to achieve an autostereoscopic display of a real 3D object, we successfully obtained the actual depth of every discrete point of the real 3D object, and overcame the inherent drawbacks of depth cameras by obtaining the point cloud from the elemental images.
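As a rough illustration of the first mapping step, the appearance counts described above can be accumulated across sub-images and then turned into (x, y, z) points. This is a minimal sketch only: the function names and the count-to-depth relation `depth_of` are assumptions for illustration, since the exact geometric relation depends on lens array parameters not given in the abstract.

```python
from collections import Counter

def occurrence_map(sub_images):
    """Count how often each lit (x, y) coordinate appears across sub-images.

    sub_images: list of sets of (x, y) pixel coordinates that belong to
    the object in each sub-image (a simplification of thresholded
    sub-images generated from the elemental images).
    """
    counts = Counter()
    for pixels in sub_images:
        counts.update(pixels)
    return counts

def to_point_cloud(counts, depth_of):
    """Map each (x, y) and its appearance count to an (x, y, z) point.

    depth_of: caller-supplied function converting an appearance count to
    a depth value; the real relation depends on the lens array geometry.
    """
    return [(x, y, depth_of(n)) for (x, y), n in sorted(counts.items())]
```

For example, a coordinate seen in two sub-images receives a different depth than one seen in only one, via whatever `depth_of` encodes the optics.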

Keywords: elemental image, point cloud, computer-generated hologram (CGH), autostereoscopic display

Procedia PDF Downloads 559
11975 Bubble Scrum: How to Run in Organizations That Only Know How to Walk

Authors: Zaheer A. Ali, George Szabo

Abstract:

SCRUM has its roots in software and web development and works very well in that space. However, any technical person who has watched a typical waterfall-managed project spiral out of control or into an abyss has thought: "there must be a better way". I will discuss how that thought leads naturally to adopting Agile principles and SCRUM, and how Agile and SCRUM can be implemented in large institutions with long histories via a method I developed: Bubble Scrum. We will also see how SCRUM can be implemented in interesting places outside the technical sphere, and discuss where and how to subtly bring Agility and SCRUM into large, rigid institutions.

Keywords: agile, enterprise-agile, agile at scale, agile transition, project management, scrum

Procedia PDF Downloads 134
11974 Experimental Investigation of Fluid Dynamic Effects on Crystallisation Scale Growth and Suppression in Agitation Tank

Authors: Prasanjit Das, M. M. K. Khan, M. G. Rasul, Jie Wu, I. Youn

Abstract:

Mineral scale formation is undoubtedly a more serious problem in the mineral industry than in other process industries. To better understand scale growth and suppression, an experimental model is proposed in this study for supersaturated crystallised solutions commonly found in mineral process plants. In this experiment, surface crystallisation of potassium nitrate (KNO3) on the wall of an agitation tank and the effects of agitation on scale growth and suppression are studied. The new quantitative scale suppression model predicts that at lower agitation speeds the scale growth rate is enhanced, while at higher agitation speeds the scale suppression rate increases due to the increased flow erosion effect. A lab-scale agitation tank, with and without baffles, was used as a benchmark in this study. The fluid dynamic effects on scale growth and suppression in the agitation tank were investigated with three impellers of different sizes (diameters 86, 114, and 160 mm; model A310 with flow number 0.56) over a range of rotational speeds (up to 700 rpm) and with solutions of different concentrations (4.5, 4.75, and 5.25 mol/dm3). For further elucidation, the effects of impeller size on the wall-surface scale growth and suppression rates, as well as on the bottom settled scale accumulation rate, are also discussed. Emphasis was placed on applications in the mineral industry, although the results are also relevant to other industrial applications.

Keywords: agitation tank, crystallisation, impeller speed, scale

Procedia PDF Downloads 194
11973 Flood Mapping and Inundation on the Weira River Watershed (in the Case of Hadiya Zone, Shashogo Woreda)

Authors: Alilu Getahun Sulito

Abstract:

Exceptional floods are now prevalent in many places in Ethiopia, resulting in a large number of human deaths and much property destruction. The Lake Boyo watershed, in particular, has traditionally been vulnerable to flash floods. The goal of this research is to create flood and inundation maps for the Boyo catchment. Geographic information system (GIS) technology was integrated with a hydraulic model (HEC-RAS) to attain this objective. The peak discharge was determined using the Fuller empirical method for return periods of 5, 10, 15, and 25 years, and the results were 103.2 m3/s, 158 m3/s, 222 m3/s, and 252 m3/s, respectively. River geometry, boundary conditions, Manning's n values for the various land covers, and the peak discharges at the various return periods were entered into HEC-RAS, and an unsteady flow analysis was performed. The results of the unsteady flow analysis demonstrate that the water surface elevation in the longitudinal profile rises as the return period increases. The flood inundation maps clearly show that the areas on the right and left sides of the river with the greatest flood coverage were 15.418 km2 and 5.29 km2, respectively, flooded at the 10-, 20-, 30-, and 50-year return periods. High water depths typically occur along the main channel and progressively spread to the floodplains. The study also found that flood-prone areas were disproportionately affected on the river's right bank. Combining GIS with hydraulic modelling to create a flood inundation map is therefore a viable solution. The findings of this study can be used to protect the right bank of the Boyo River catchment near the Boyo Lake kebeles. Furthermore, it is critical to promote an early warning system in the kebeles so that people can be evacuated before a flood calamity happens.
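For context, the Fuller empirical method mentioned above is commonly stated as Q_T = Q_m(1 + 0.8 log10 T), where Q_m is the mean annual maximum flood and T the return period in years. A minimal sketch under that assumption follows; the study's exact coefficients and any area-correction terms are not given in the abstract.

```python
import math

def fuller_peak_discharge(q_mean, return_period_years):
    """Fuller-type empirical estimate of the T-year peak discharge (m3/s).

    q_mean: mean annual maximum discharge. The 0.8 coefficient is the
    classic Fuller (1914) value and may be recalibrated per basin.
    """
    return q_mean * (1.0 + 0.8 * math.log10(return_period_years))
```

With q_mean = 100 m3/s, for instance, the 10-year peak under this form is 180 m3/s.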

Keywords: Weira River, Boyo, GIS, HEC- GEORAS, HEC- RAS, Inundation Mapping

Procedia PDF Downloads 33
11972 Investigation of the Stability of the F* Iterative Algorithm on Strong Pseudocontractive Mappings and Its Applications

Authors: Felix Damilola Ajibade, Opeyemi O. Enoch, Taiwo Paul Fajusigbe

Abstract:

This paper is centered on conducting an inquiry into the stability of the F* iterative algorithm to the fixed point of a strongly pseudo-contractive mapping in the framework of uniformly convex Banach spaces. To achieve the desired result, certain existing inequalities in convex Banach spaces were utilized, as well as the stability criteria of Harder and Hicks. Other necessary conditions for the stability of the F* algorithm on strong pseudo-contractive mapping were also obtained. Through a numerical approach, we prove that the F* iterative algorithm is H-stable for strongly pseudo-contractive mapping. Finally, the solution of the mixed-type Volterra-Fredholm functional non-linear integral equation is estimated using our results.
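For the reader's reference, the standard definition underlying the abstract can be stated as follows (a sketch; the paper's exact formulation may differ): a mapping T on a Banach space X is strongly pseudocontractive if there exists k in (0,1) such that for all x, y in X there is j(x-y) in J(x-y) with

```latex
\langle Tx - Ty,\; j(x - y) \rangle \le k \,\lVert x - y \rVert^{2},
```

where J denotes the normalized duality mapping; stability in the sense of Harder and Hicks then requires that approximate orbits of the iteration still converge to the fixed point.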

Keywords: stability, F*-iterative algorithm, pseudo-contractive mappings, uniformly convex Banach space, mixed-type Volterra-Fredholm integral equation

Procedia PDF Downloads 77
11971 MapReduce Logistic Regression Algorithms with RHadoop

Authors: Byung Ho Jung, Dong Hoon Lim

Abstract:

Logistic regression is a statistical method for analyzing a dataset in which one or more independent variables determine an outcome. It is used extensively in numerous disciplines, including the medical and social science fields. In this paper, we address the problem of estimating the parameters of a logistic regression based on the MapReduce framework with RHadoop, which integrates R and the Hadoop environment and is applicable to large-scale data. Three learning algorithms exist for logistic regression: the gradient descent method, the cost minimization method, and the Newton-Raphson method. The Newton-Raphson method does not require a learning rate, while the gradient descent and cost minimization methods need a manually chosen learning rate. The experimental results demonstrate that our learning algorithms using RHadoop scale well and efficiently process large data sets on commodity hardware. We also compared the performance of our Newton-Raphson method with the gradient descent and cost minimization methods. The results showed that the Newton-Raphson method was the most robust across all data tested.
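To make the gradient descent variant concrete, here is a minimal, self-contained sketch in plain Python (not the paper's RHadoop code; in the MapReduce setting the per-example gradient terms would be computed by mappers and summed by a reducer, which the inner loop stands in for):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_gradient_descent(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent for logistic regression (bias + weights).

    X: list of feature rows; y: list of 0/1 labels. The per-example
    gradient terms summed below correspond to the map phase output that
    a reducer would aggregate in a MapReduce implementation.
    """
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)                      # w[0] is the bias term
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi                     # derivative of the log-loss
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w
```

On a small separable dataset the fitted weights push predictions below 0.5 for one class and above 0.5 for the other.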

Keywords: big data, logistic regression, MapReduce, RHadoop

Procedia PDF Downloads 254
11970 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) is the ultimate obstacle to its massification. Because of this obstacle, the diffusion of eco-design and LCA methods in the manufacturing sectors may prove impossible. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analysed the literature to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain existing methods and combining them with new ideas and improvements. In the second part, our development of automated model construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are thus generated automatically. The tool developed has shown that automation by code is a viable way to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes its complexity. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the proposed method shows potential for a wide range of applications.

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 63
11969 Research on Quality Assurance in African Higher Education: A Bibliometric Mapping from 1999 to 2019

Authors: Luís M. João, Patrício Langa

Abstract:

The article reviews the literature on quality assurance (QA) in African higher education studies (HES) through a bibliometric mapping of papers published between 1999 and 2019. Specifically, the article highlights the nuances of knowledge production in four scientific databases: Scopus, Web of Science (WoS), African Journals Online (AJOL), and Google Scholar. The analysis included 531 papers, of which 127 are from Scopus, 30 from Web of Science, 85 from African Journals Online, and 259 from Google Scholar. These papers were written by 284 authors from 231 institutions and 69 different countries (54 in Africa and 15 outside Africa). The results map the existing knowledge production in the field. The analysis allows readers to understand the growth and development of the field over the two-decade period, identify key contributors, and observe potential trends or gaps in the research. The paper employs bibliometric mapping as its primary analytical lens. By utilizing this method, the study quantitatively assesses the publications related to QA in African HES, helping to identify patterns, collaboration networks, and disparities in research output. The bibliometric approach allows for a systematic and objective analysis of large datasets, offering a comprehensive view of knowledge production in the field. Furthermore, the study highlights the lack of shared resources available to enhance quality in higher education institutions (HEIs) in Africa. This finding underscores the importance of promoting collaborative research efforts, knowledge exchange, and capacity building within the region to improve the overall quality of higher education. The paper argues that despite the growing quantity of QA research in African higher education, there are challenges related to citation impact and access to high-impact publication avenues for African researchers. It emphasises the need to promote collaborative research and resource-sharing to enhance the quality of HEIs in Africa. The analytical lens of bibliometric mapping and the examination of publication scenarios contribute to a comprehensive understanding of the field and its implications for African higher education.

Keywords: Africa, bibliometric research, higher education studies, quality assurance, scientific database, systematic review

Procedia PDF Downloads 27
11968 The Development of the Self-concept Scale for Elders in Taiwan

Authors: Ting-Chia Lien, Tzu-Yin Yen, Szu-Fan Chen, Tai-chun Kuo, Hung-Tse Lin, Yi-Chen Chung, Hock-Sen Gwee

Abstract:

The purpose of this study was to develop a "Self-Concept Scale for Elders" and report the results of the survey, which could serve community counseling and guidance institutions in practical application. The sample consisted of 332 elders in Taiwan (male: 33.4%; female: 66.6%), aged 65 to 98 years. The measurement applied in this study is the "Self-Concept Scale for Elders". After item and factor analyses, the preliminary version of the scale was revised to the final version. The results are summarized as follows: 1) the Self-Concept Scale for Elders contains 10 items; 2) the scale accounted for 77.15% of the explained variance, with satisfactory corrected item-total correlations and Cronbach's alpha = 0.87; 3) the content validity, criterion validity, and construct validity were found to be satisfactory. Based on the findings, implications and suggestions are offered for counselor education and future research.
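The Cronbach's alpha reported above can be computed from item-level scores; a minimal sketch (the function name is illustrative, and sample variance with an n-1 denominator is assumed):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of k lists, each holding one item's scores over the
    same respondents, in the same respondent order.
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    item_var_sum = sum(var(it) for it in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```

Two perfectly correlated items give alpha = 1; an item with no variance contributes nothing to internal consistency.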

Keywords: self-concept, elder, development scale, applied psychology

Procedia PDF Downloads 545
11967 Bag of Local Features for Person Re-Identification on Large-Scale Datasets

Authors: Yixiu Liu, Yunzhou Zhang, Jianning Chi, Hao Chu, Rui Zheng, Libo Sun, Guanghao Chen, Fangtong Zhou

Abstract:

In the last few years, large-scale person re-identification has attracted a lot of attention in video surveillance, since it has potential applications in public safety management. However, it is still a challenging job considering the variation in human pose, changing illumination conditions, and the lack of paired samples. Although accuracy has been significantly improved, sample training remains heavily data-dependent. To tackle this problem, a new strategy is proposed for designing the feature representation based on the bag-of-visual-words (BoVW) model, which has been widely used in the field of image retrieval. Local features are extracted, a more discriminative feature representation is obtained by cross-view dictionary learning (CDL), and the assignment map is then obtained through k-means clustering. Finally, BoVW histograms are formed, which encode the images with the statistics of the feature classes in the assignment map. Experiments conducted on the CUHK03, Market1501, and MARS datasets show that the proposed method performs favorably against existing approaches.
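The final encoding step can be sketched as follows. This is an illustration with a given codebook; in the paper the codebook comes from k-means on CDL-projected features, and the function names here are assumptions.

```python
def assign(descriptor, codebook):
    """Index of the nearest visual word (squared Euclidean distance)."""
    dists = [sum((a - b) ** 2 for a, b in zip(descriptor, word))
             for word in codebook]
    return dists.index(min(dists))

def bovw_histogram(descriptors, codebook):
    """L1-normalized bag-of-visual-words histogram for one image.

    descriptors: local features extracted from the image;
    codebook: list of visual-word centroids (here simply given).
    """
    hist = [0.0] * len(codebook)
    for d in descriptors:
        hist[assign(d, codebook)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]
```

Two images can then be compared by any histogram distance, which is what makes the representation usable for re-identification and re-ranking.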

Keywords: bag of visual words, cross-view dictionary learning, person re-identification, reranking

Procedia PDF Downloads 171
11966 Developing NAND Flash-Memory SSD-Based File System Design

Authors: Jaechun No

Abstract:

This paper focuses on the I/O optimizations of N-hybrid (New-Form of hybrid), which provides a hybrid file system space constructed on SSD and HDD. Although the promising potential of SSDs, such as the absence of mechanical moving overhead and high random I/O throughput, has drawn a lot of attention from IT enterprises, their high cost-to-capacity ratio makes it less desirable to build a large-scale data storage subsystem composed only of SSDs. In this paper, we present N-hybrid, which attempts to integrate the strengths of SSD and HDD to offer a single, large hybrid file system space. Several experiments were conducted to verify the performance of N-hybrid.

Keywords: SSD, data section, I/O optimizations, hybrid system

Procedia PDF Downloads 396
11965 Parallelizing the Hybrid Pseudo-Spectral Time Domain/Finite Difference Time Domain Algorithms for Large-Scale Electromagnetic Simulations Using the Message Passing Interface Library

Authors: Donggun Lee, Q-Han Park

Abstract:

Due to its coarse grid, the Pseudo-Spectral Time Domain (PSTD) method has advantages over the Finite Difference Time Domain (FDTD) method in terms of memory requirement and operation time. However, since its parallelization efficiency is much lower than that of FDTD, PSTD is not a useful method for large-scale electromagnetic simulations on parallel platforms. In this paper, we propose a parallelization technique for the hybrid PSTD-FDTD (HPF) method, which simultaneously possesses the efficient parallelizability of FDTD and the speed and low memory requirement of PSTD. The parallelization cost of the HPF method is exactly the same as that of parallel FDTD, yet it occupies much less memory and operates faster than parallel FDTD. Experiments on distributed memory systems have shown that the parallel HPF method saves up to 96% of the operation time and reduces the memory requirement by 84%. Also, by combining the OpenMP library with the MPI library, we further reduced the operation time of the parallel HPF method by 50%.
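The communication pattern that makes FDTD (and hence the FDTD layer of HPF) parallelize well is the nearest-neighbour halo exchange. The sketch below simulates that pattern serially in Python for a 1-D 3-point stencil and checks that the decomposed update reproduces the single-domain one; it is a conceptual illustration only, not the authors' MPI code.

```python
def step(u):
    """One explicit 3-point stencil update; the two end cells stay fixed."""
    return ([u[0]] +
            [0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[i + 1]
             for i in range(1, len(u) - 1)] +
            [u[-1]])

def decomposed_steps(u, nparts, nsteps):
    """Run the same stencil on nparts subdomains with one-cell halos.

    Each step, neighbouring subdomains exchange their edge cells, which
    is the pattern an MPI halo exchange performs between ranks; here it
    is simulated in a single process for clarity.
    """
    size = len(u) // nparts
    parts = [u[i * size:(i + 1) * size] for i in range(nparts)]
    b_left, b_right = u[0], u[-1]          # fixed global boundary values
    for _ in range(nsteps):
        new_parts = []
        for i, p in enumerate(parts):
            # ghost cells: copies of the neighbours' edge values
            left = parts[i - 1][-1] if i > 0 else b_left
            right = parts[i + 1][0] if i < nparts - 1 else b_right
            new_parts.append(step([left] + p + [right])[1:-1])
        new_parts[0][0] = b_left           # re-impose fixed boundaries
        new_parts[-1][-1] = b_right
        parts = new_parts
    return [x for p in parts for x in p]
```

Because each subdomain only talks to its two neighbours, the communication volume per step is constant while the compute volume shrinks with the number of ranks, which is why this pattern scales.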

Keywords: FDTD, hybrid, MPI, OpenMP, PSTD, parallelization

Procedia PDF Downloads 119
11964 Earth Tremors in Nigeria: A Precursor to Major Disaster?

Authors: Oluseyi Adunola Bamisaiye

Abstract:

The frequency of occurrence of earth tremors in Nigeria has increased tremendously in recent years. Slow earthquakes and tremors have preceded some large earthquakes in other regions of the world, and the Nigerian case may not be an exception. Timely and careful investigation of these tremors may reveal their relation to large earthquakes and provide important clues to constrain the slip rates on tectonic faults. This makes it imperative to keep the tectonically active terrains within the country under watch and to study them carefully, in order to forecast adequately, prescribe mitigation measures, and avoid a major disaster. This report provides new evidence of a slow slip transient in a strongly locked seismogenic zone of the Okemesi fold belt. The aim of this research is to investigate different methods of earth tremor monitoring using fault slip analysis and mapping of the Okemesi hills, which have been the epicentre of most of the recent tremors.

Keywords: earth tremor, fault slip, intraplate activities, plate tectonics

Procedia PDF Downloads 130
11963 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning

Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga

Abstract:

Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including noise and air pollution, which are considered hot topics (cf. the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together, because the noise pollution present in cities is often linked to traffic and industry, which produce air pollutants as well. Traffic congestion can create both noise and air pollution: NO₂ is mostly created from the oxidation of NO, and both are notoriously produced by combustion at high temperatures (e.g., car engines or thermal power stations), and the same holds for industrial plants. What has to be investigated, and is the topic of this paper, is whether there really is a correlation between noise pollution and air pollution (taking NO₂ into account) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise app will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, to stay outdoors, with an external battery to allow it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone and calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are connected. After assembly, the sensors will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time covering both week and weekend days, so that changes during the week can be observed.
The novelty is that the data will be compared to check whether there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw sensor values. To do so, the data will be converted to a scale that goes up to 100% and shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help choose the right mitigation solutions for the analysed area, because it makes it possible to address both the noise and the air pollution problem with a single intervention. The mitigation solutions must consider not only health aspects but also how to create a more livable space for citizens. The paper describes in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
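The percentage conversion and the correlation check described above can be sketched as follows (a minimal illustration with hypothetical readings; note that Pearson's correlation is unchanged by min-max rescaling, so the 0-100% conversion matters for the comparative maps rather than for the correlation itself):

```python
def to_percent(values):
    """Min-max scale a series of readings onto a common 0-100 % scale."""
    lo, hi = min(values), max(values)
    return [100.0 * (v - lo) / (hi - lo) for v in values]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A coefficient near 1 for paired noise and NO₂ series would support the hypothesis that a single intervention can mitigate both.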

Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter

Procedia PDF Downloads 194
11962 The Discriminant Analysis and Relevant Model for Mapping Export Potential

Authors: Jana Gutierez Chvalkovska, Michal Mejstrik, Matej Urban

Abstract:

There are pending discussions over the mapping of country export potential in order to refocus firms' export strategies and their evidence-based promotion by Export Credit Agencies (ECAs) and other permitted government vehicles. In this paper, we develop our version of an applied model that offers "stepwise" elimination of unattractive markets. We modify and calibrate the model for the particular features of the Czech Republic and for specific pilot cases, applying an individual approach to each sector.

Keywords: export strategy, modeling export, calibration, export promotion

Procedia PDF Downloads 480
11961 Use of Landsat OLI Images in the Mapping of Landslides: Case of the Taounate Province in Northern Morocco

Authors: S. Benchelha, H. Chennaoui, M. Hakdaoui, L. Baidder, H. Mansouri, H. Ejjaaouani, T. Benchelha

Abstract:

Northern Morocco is characterized by relatively young mountains experiencing very important dynamics compared to other areas of Morocco. The dynamics associated with the formation of the Rif chain (Alpine tectonics) are accompanied by instabilities essentially related to tectonic movements. The construction of important infrastructure (roads, highways, etc.) represents a triggering factor that favours landslides. This paper is part of the establishment of a landslide susceptibility map and concerns the mapping of unstable areas in the province of Taounate. Landslides were identified using the components of false color composites (FCC) of Landsat OLI images: i) the first independent component (IC1), ii) the principal component (PC), and iii) the normalized difference index (NDI). The mapping of the landslide class was validated by in-situ surveys.
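A normalized difference index of the kind listed above is computed pixel-wise from two bands; a minimal sketch (the band pair used in the study is not specified here, and the function name is illustrative):

```python
def normalized_difference(band_a, band_b):
    """Pixel-wise normalized difference index: (A - B) / (A + B).

    band_a, band_b: equal-sized sequences of reflectance values, e.g.
    two Landsat OLI bands; zero-sum pixels are mapped to 0 to avoid
    division by zero.
    """
    out = []
    for a, b in zip(band_a, band_b):
        s = a + b
        out.append((a - b) / s if s else 0.0)
    return out
```

The result ranges from -1 to 1 and can be thresholded, together with the IC1 and PC components, to flag candidate unstable pixels.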

Keywords: landslides, False Color Composite (FCC), Independent Component Analysis (ICA), Principal Component Analysis (PCA), Normalized Difference Index (NDI), Normalized Difference Mid Red Index (NDMIDR)

Procedia PDF Downloads 268
11960 Mapping Soils from Terrain Features: The Case of Nech Sar National Park of Ethiopia

Authors: Shetie Gatew

Abstract:

Current soil maps of Ethiopia do not represent accurately the soils of Nech Sar National Park. In the framework of studies on the ecology of the park, we prepared a soil map based on field observations and a digital terrain model derived from SRTM data with a 30-m resolution. The landscape comprises volcanic cones, lava and basalt outflows, undulating plains, horsts, alluvial plains and river deltas. SOTER-like terrain mapping units were identified. First, the DTM was classified into 128 terrain classes defined by slope gradient (4 classes), relief intensity (4 classes), potential drainage density (2 classes), and hypsometry (4 classes). A soil-landscape relation between the terrain mapping units and WRB soil units was established based on 34 soil profile pits. Based on this relation, the terrain mapping units were either merged or split to represent a comprehensive soil and terrain map. The soil map indicates that Leptosols (30 %), Cambisols (26%), Andosols (21%), Fluvisols (12 %), and Vertisols (9%) are the most widespread Reference Soil Groups of the park. In contrast, the harmonized soil map of Africa derived from the FAO soil map of the world indicates that Luvisols (70%), Vertisols (14%) and Fluvisols (16%) would be the most common Reference Soil Groups. However, these latter mapping units are not consistent with the topography, nor did we find such extensive areas occupied by Luvisols during the field survey. This case study shows that with the now freely available SRTM data, it is possible to improve current soil information layers with relatively limited resources, even in a complex terrain like Nech Sar National Park.
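The 128 terrain classes mentioned above follow from combining 4 slope classes, 4 relief classes, 2 drainage classes, and 4 hypsometry classes (4 x 4 x 2 x 4 = 128). A minimal sketch of such a per-pixel coding follows; the break values below are illustrative assumptions, since the study's actual thresholds are not given in the abstract.

```python
import bisect

def classify(value, breaks):
    """0-based class index of value given ascending class break values."""
    return bisect.bisect_right(breaks, value)

def terrain_class(slope, relief, drainage, elevation):
    """Combine four per-pixel attributes into one of 4*4*2*4 = 128 classes.

    All break values are placeholders for illustration only.
    """
    s = classify(slope, [2, 8, 15])              # slope gradient, 4 classes (%)
    r = classify(relief, [50, 100, 200])         # relief intensity, 4 classes (m)
    d = classify(drainage, [1.5])                # potential drainage density, 2 classes
    h = classify(elevation, [500, 1000, 1500])   # hypsometry, 4 classes (m)
    return ((s * 4 + r) * 2 + d) * 4 + h         # unique id in 0..127
```

Each SOTER-like terrain mapping unit then corresponds to one class id, to which a Reference Soil Group is assigned from the profile-pit observations.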

Keywords: andosols, cambisols, digital elevation model, leptosols, soil-landscaps relation

Procedia PDF Downloads 74
11959 Mapping the Intrinsic Vulnerability of the Quaternary Aquifer of the Eastern Mitidja (Northern Algeria)

Authors: Abida Haddouche, Ahmed Chrif Toubal

Abstract:

The Neogene basin of the Eastern Mitidja, the object of the study area, holds potential water resources, especially groundwater reserves. This water is an important economic resource; it is also highly sensitive and needs protection and preservation. Unfortunately, these waters are exposed to various forms of pollution, whether urban, agricultural, industrial, or merely accidental, and this pollution is a permanent risk that limits the resource. In this context, this work aims to evaluate the intrinsic vulnerability of the aquifer in order to protect and preserve the quality of this resource. It will focus on providing water and land managers with an accessible cartographic document that locates the areas where the groundwater is most vulnerable. Vulnerability mapping of the Eastern Mitidja Quaternary aquifer was performed by applying three methods (DRASTIC, DRIST, and GOD). Comparison and validation of the results show that DRASTIC is the most suitable method for mapping the aquifer vulnerability of the study area.
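For reference, the DRASTIC index combines seven parameter ratings with the standard weights of Aller et al. (1987); a minimal sketch (the rating values fed in are site-specific, and this cell-level sum is an illustration rather than the authors' GIS workflow):

```python
# Standard DRASTIC parameter weights (Aller et al., 1987).
DRASTIC_WEIGHTS = {
    "D": 5,  # Depth to water
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography (slope)
    "I": 5,  # Impact of the vadose zone
    "C": 3,  # hydraulic Conductivity
}

def drastic_index(ratings):
    """Weighted sum of the seven DRASTIC ratings (each usually 1-10).

    ratings: dict mapping the parameter letters to their rating for one
    cell of the map; a higher index means higher intrinsic vulnerability.
    """
    return sum(DRASTIC_WEIGHTS[k] * ratings[k] for k in DRASTIC_WEIGHTS)
```

With ratings on a 1-10 scale, the index ranges from 23 to 230, which is then binned into the vulnerability classes shown on the map.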

Keywords: Aquifer of Mitidja, DRASTIC method, geographic information system (GIS), vulnerability mapping

Procedia PDF Downloads 363
11958 A New Method for Winner Determination in Economic Resource Allocation in Cloud Computing Systems

Authors: Ebrahim Behrouzian Nejad, Rezvan Alipoor Sabzevari

Abstract:

Cloud computing systems are large-scale distributed systems that focus on large-scale resource sharing, cooperation among several organizations, and their use in new applications. One of the main challenges in this realm is resource allocation. There are many different approaches to resource allocation in cloud computing, among which economic methods are common. Among these, auction-based methods have greater prominence than the fixed-price method. The double combinatorial auction is one suitable approach to resource allocation in cloud computing. It includes two phases: winner determination and resource allocation. In this paper, a new method is presented to determine the winners in double combinatorial auction-based resource allocation, using the Imperialist Competitive Algorithm (ICA). The experimental results show that the proposed method yields a higher number of winning users than the genetic algorithm, whereas the number of winning providers is higher with the genetic algorithm.
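As a baseline for intuition, winner determination in a simple (non-combinatorial) double auction can be done greedily by matching the highest bids with the cheapest asks. The sketch below is an illustration only, far simpler than the paper's combinatorial setting and its ICA search:

```python
def greedy_winners(bids, asks):
    """Greedy winner determination for a single-unit double auction.

    bids: (user, price) offers to buy one resource bundle;
    asks: (provider, price) offers to sell one. Buyers are matched with
    the cheapest remaining sellers while bid >= ask.
    """
    bids = sorted(bids, key=lambda b: -b[1])   # highest bids first
    asks = sorted(asks, key=lambda a: a[1])    # cheapest asks first
    winners = []
    for (user, bid), (provider, ask) in zip(bids, asks):
        if bid >= ask:
            winners.append((user, provider))
        else:
            break                              # no further profitable match
    return winners
```

In the combinatorial case, bundles of heterogeneous resources make this matching NP-hard, which is what motivates metaheuristics such as ICA or genetic algorithms.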

Keywords: cloud computing, resource allocation, double auction, winner determination

Procedia PDF Downloads 340
11957 The Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement, and electronics. System accuracy strictly depends on precise laser footprint detection on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The measurement device was a purpose-built optical rangefinder, equipped with vertical sweeping of the laser beam and a high-quality camera. The system was developed mainly for the automatic exploration and mapping of unknown spaces. The first section presents a new detection algorithm. The second section presents measurement results; the measurements were performed under variable light conditions in interiors. The last part of the article presents the achieved results and the differences between day and night measurements.
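The two stages named above, colour segmentation followed by component labelling, can be sketched as follows (threshold values and function names are illustrative assumptions; a red laser is assumed):

```python
def red_mask(image, r_min=200, g_max=80, b_max=80):
    """Binary mask of pixels whose colour matches a red laser line.

    image: 2-D grid of (r, g, b) tuples; the thresholds would be tuned
    to the laser wavelength and lighting conditions.
    """
    return [[1 if (p[0] >= r_min and p[1] <= g_max and p[2] <= b_max) else 0
             for p in row] for row in image]

def label_components(mask):
    """4-connected component labelling by flood fill; returns a label grid."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                next_label += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and mask[cy][cx] and not labels[cy][cx]):
                        labels[cy][cx] = next_label
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels
```

The largest component is then taken as the laser footprint, and its centroid per image column gives the line position used for triangulation.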

Keywords: color segmentation, component labelling, laser line detection, automatic mapping, distance measurement, vector map

Procedia PDF Downloads 409
11956 A Modeling Approach for Blockchain-Oriented Information Systems Design

Authors: Jiaqi Yan, Yani Shi

Abstract:

Blockchain technology is regarded as the most promising technology with the potential to trigger a technological revolution. However, beyond the bitcoin industry, we have not yet seen a large-scale application of blockchain in the domains it is supposed to impact, such as supply chains, financial networks, and intelligent manufacturing. The reasons lie not only in the difficulties of blockchain implementation but also in the challenges of blockchain-oriented information systems design. First, blockchain members are self-interested actors belonging to organizations with different existing information systems; as they expect different information inputs and outputs from the blockchain application, a common language protocol is needed to facilitate communication between blockchain members. Second, considering the decentralization of blockchain organization, there is no central authority to organize and coordinate the business processes, so information systems built on blockchain should support more adaptive business processes. This paper aims to address these difficulties by providing a modeling approach for blockchain-oriented information systems design. We investigate the information structure of distributed-ledger data with conceptual modeling techniques and ontology theories, and build an effective ontology mapping method for inter-organization information flows and blockchain information records. Further, we study distributed-ledger-ontology-based business process modeling to support adaptive enterprises on blockchain.

Keywords: blockchain, ontology, information systems modeling, business process

Procedia PDF Downloads 410
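The ontology mapping idea, translating between an organization's local schema and a shared ledger vocabulary, can be illustrated with a minimal sketch. All field names and the `OntologyMapping` class are hypothetical, not from the paper:

```python
# Shared ledger vocabulary that all blockchain members agree on
# (an illustrative stand-in for a formal ontology).
LEDGER_ONTOLOGY = {"asset_id", "owner", "timestamp", "quantity"}

class OntologyMapping:
    """Bidirectional mapping between one member's local schema
    and the shared distributed-ledger vocabulary."""

    def __init__(self, local_to_ledger):
        unknown = set(local_to_ledger.values()) - LEDGER_ONTOLOGY
        if unknown:
            raise ValueError(f"unknown ledger terms: {unknown}")
        self.local_to_ledger = local_to_ledger
        self.ledger_to_local = {v: k for k, v in local_to_ledger.items()}

    def to_ledger(self, record):
        """Translate a local record into the shared ledger vocabulary."""
        return {self.local_to_ledger[k]: v for k, v in record.items()
                if k in self.local_to_ledger}

    def from_ledger(self, record):
        """Translate a ledger record back into the local schema."""
        return {self.ledger_to_local[k]: v for k, v in record.items()
                if k in self.ledger_to_local}

# A hypothetical supplier whose ERP uses different field names.
supplier = OntologyMapping({"sku": "asset_id", "holder": "owner", "qty": "quantity"})
ledger_record = supplier.to_ledger({"sku": "A-17", "holder": "OrgA", "qty": 40})
```

A common vocabulary like this is what lets members with heterogeneous information systems read each other's ledger entries without pairwise translation agreements.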
11955 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build a cumulative scale to test new media credibility. This approach builds on Western conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), adding user and cultural context as key components for assessing new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

Keywords: credibility scale, media credibility components, new media credibility scale, scale development

Procedia PDF Downloads 296
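When a multidimensional scale like this is eventually tested, its internal consistency is conventionally checked with a statistic such as Cronbach's alpha. A minimal sketch (a standard psychometric formula, not part of the paper itself):

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency of a scale.

    item_scores: one list per scale item (e.g. one per credibility
    component), each holding the same respondents' ratings in order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(item_scores)
    totals = [sum(ratings) for ratings in zip(*item_scores)]
    item_var = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / variance(totals))
```

Values near 1.0 indicate that the items measure a coherent underlying construct; low values suggest the components do not hang together as one dimension.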
11954 Scalable UI Test Automation for Large-scale Web Applications

Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani

Abstract:

This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for the web application. The quality assurance team must execute many manual user interface test cases during development to confirm there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time exceeded 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient workflow when introducing scalability into a traditional test automation environment; to introduce scalability efficiently, a scripting language was adopted. The scalability mechanism is implemented mainly with AWS serverless technology, specifically the Elastic Container Service. Scalability here means the ability to automatically provision computers for test automation and to increase or decrease the number of computers running the tests. This scalable mechanism lets test cases run in parallel, so test execution time is dramatically decreased. Introducing scalable test automation is also about more than reducing execution time: because test cases can be executed at the same time, challenging bugs such as race conditions may be detected.
If API and unit tests are implemented, test strategies can be adopted more efficiently alongside this scalability testing. However, in web applications, API and unit testing cannot as a practical matter cover 100% of functional testing, since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The paper first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging issue detection.

Keywords: AWS, Elastic Container Service, scalability, serverless, UI automation test

Procedia PDF Downloads 72
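The core of the speed-up, splitting a large suite across parallel containers, can be sketched with a greedy scheduling heuristic. The function name and the longest-processing-time strategy are illustrative assumptions; the paper does not specify how test cases are assigned to containers:

```python
import heapq

def shard_tests(durations, workers):
    """Greedy longest-processing-time assignment of test cases to
    parallel containers. Returns the per-worker duration lists and
    the estimated wall-clock (makespan) time.

    durations: expected runtime of each test case, in any time unit.
    workers: number of parallel containers (e.g. ECS tasks)."""
    shards = [[] for _ in range(workers)]
    heap = [(0.0, i) for i in range(workers)]  # (current load, worker index)
    heapq.heapify(heap)
    for d in sorted(durations, reverse=True):  # longest tests first
        load, i = heapq.heappop(heap)          # least-loaded worker
        shards[i].append(d)
        heapq.heappush(heap, (load + d, i))
    return shards, max(sum(s) for s in shards)
```

With a sufficiently even split, wall-clock time approaches total suite time divided by the worker count, which is how a 17-hour serial run can shrink to a fraction of that.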
11953 Inkjet Printed Silver Nanowire Network as Semi-Transparent Electrode for Organic Photovoltaic Devices

Authors: Donia Fredj, Marie Parmentier, Florence Archet, Olivier Margeat, Sadok Ben Dkhil, Jorg Ackerman

Abstract:

Transparent conductive electrodes (TCEs), or transparent electrodes (TEs), are a crucial part of many electronic and optoelectronic devices such as touch panels, liquid crystal displays (LCDs), organic light-emitting diodes (OLEDs), solar cells, and transparent heaters. The indium tin oxide (ITO) electrode is the most widely used transparent electrode due to its excellent optoelectrical properties. However, the drawbacks of ITO, such as its high cost, the scarcity of indium, and its fragile nature, limit its application in large-scale flexible electronic devices. Flexibility is becoming more and more attractive, since flexible electrodes have the potential to open new applications that require transparent electrodes to be flexible, cheap, and compatible with large-scale manufacturing methods. So far, several alternatives to ITO have been developed, including metal nanowires, conjugated polymers, carbon nanotubes, and graphene, which have been extensively investigated for use as flexible, low-cost electrodes. Among them, silver nanowires (AgNWs) are one of the most promising alternatives to ITO thanks to their excellent properties: high electrical conductivity as well as desirable light transmittance. In recent years, inkjet printing has become a promising technique for large-scale printed flexible and stretchable electronics; however, inkjet printing of AgNWs still presents many challenges. In this study, a synthesis of stable AgNWs that could compete with ITO was developed, and the material was printed by inkjet technology directly onto a flexible substrate. Additionally, we analyzed the surface microstructure and the optical and electrical properties of the printed AgNW layers. Our further research focused on all-inkjet-printed organic modules with high efficiency.

Keywords: transparent electrodes, silver nanowires, inkjet printing, formulation of stable inks

Procedia PDF Downloads 196
11952 Investigation of the Material Behaviour of Polymeric Interlayers in Broken Laminated Glass

Authors: Martin Botz, Michael Kraus, Geralt Siebert

Abstract:

The use of laminated glass is gaining importance in structural engineering. For safety reasons, at least two glass panes are laminated together with a polymeric interlayer. If one or all of the glass panes break, the glass fragments remain connected to the interlayer by adhesion forces, and a certain residual load-bearing capacity is left in the system. The polymer interlayers used in laminated glass show viscoelastic material behavior: stresses and strains in the interlayer depend on load duration and temperature. In the intact stage, only small strains appear in the interlayer, so the material can be described linearly. In the broken stage, large strains can appear, and a non-linear viscoelastic material theory is necessary. Relaxation tests on two different types of polymeric interlayers are performed at different temperatures and strain amplitudes to determine the boundary of the non-linear material regime. Based on the small-scale specimen results, further tests on broken laminated glass panes are conducted. So-called 'through-crack-bending' (TCB) tests are performed, in which the laminated glass has a defined crack pattern. The test set-up is realized so that one glass layer can still transfer compressive stresses, while tensile stresses must be transferred by the interlayer alone. The TCB tests are also conducted at different temperatures under constant force (creep tests). The aim of these experiments is to determine whether the results of small-scale tests on the interlayer are transferable to a laminated glass system in the broken stage. In this study, the limits of applicability of linear viscoelasticity are established for two commercially available polymer interlayers. Furthermore, it is shown that the results of the small-scale tests agree to a certain degree with the results of the TCB large-scale experiments.
In a future step, the results can be used to develop material models for the post-breakage performance of laminated glass.

Keywords: glass breakage, laminated glass, relaxation test, viscoelasticity

Procedia PDF Downloads 104
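Relaxation-test data of this kind is conventionally fitted with a generalized Maxwell (Prony series) model, with a WLF shift factor capturing the temperature dependence. The sketch below uses these standard linear-viscoelastic forms with illustrative coefficients; it is not fitted to the interlayers tested in the paper:

```python
import math

def relaxation_modulus(t, g_inf, prony_terms):
    """Prony-series relaxation modulus of a generalized Maxwell model:
        G(t) = G_inf + sum_i G_i * exp(-t / tau_i)
    prony_terms: list of (G_i, tau_i) pairs; units of G set the output."""
    return g_inf + sum(g_i * math.exp(-t / tau_i) for g_i, tau_i in prony_terms)

def wlf_shift(temp, t_ref, c1=17.44, c2=51.6):
    """Williams-Landel-Ferry shift factor log10(a_T) relating relaxation
    at temperature `temp` to the reference curve at `t_ref`.
    The 'universal' constants are illustrative defaults."""
    return -c1 * (temp - t_ref) / (c2 + temp - t_ref)
```

At t = 0 the modulus equals the instantaneous value G_inf + sum(G_i); for long load durations it relaxes toward G_inf, which is exactly the time dependence the relaxation tests characterize.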
11951 Public Private Partnership for Infrastructure Projects: Mapping the Key Risks

Authors: Julinda Keçi

Abstract:

In many countries, governments have been promoting the involvement of private-sector entities in long-term agreements for the development and delivery of large infrastructure projects, with a focus on overcoming the limitations on public funds of the traditional approach. The involvement of the private sector through public-private partnerships (PPP) brings in new capital investment and value for money, along with additional risks to handle. Research studies worldwide have shown that an objective, systematic, reliable, and user-oriented risk assessment process, together with an optimal allocation mechanism among the different stakeholders, is crucial to successful completion. In this framework, this paper, the first stage of a research study, aims to identify the main risks for the delivery of PPP projects. A review of cross-country research projects and case studies was performed to map the key risks affecting PPP infrastructure delivery. The mapping matrix summarizes the frequency of factors, clustered into eleven categories: Construction, Design, Economic, Legal, Market, Natural, Operation, Political, Project finance, Project selection, and Relationship. The results highlight the most critical risk factors and will hopefully assist project managers in directing managerial attention in the later stages of risk allocation.

Keywords: construction, infrastructure, public private partnerships, risks

Procedia PDF Downloads 415
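A frequency matrix of this kind amounts to tallying how many reviewed studies mention each risk factor and rolling the counts up into the eleven categories. A minimal sketch; the factor names and category assignments below are illustrative examples, not the paper's actual mapping:

```python
from collections import Counter

# Illustrative factor-to-category assignments (not the paper's full list).
RISK_CATEGORIES = {
    "cost overrun": "Construction",
    "design deficiency": "Design",
    "inflation": "Economic",
    "change in law": "Legal",
    "demand below forecast": "Market",
    "flood": "Natural",
    "operation cost overrun": "Operation",
    "expropriation": "Political",
    "interest rate fluctuation": "Project finance",
    "wrong project selection": "Project selection",
    "lack of trust": "Relationship",
}

def frequency_matrix(study_findings):
    """Tally how many studies report each risk factor, then cluster
    the counts into categories. study_findings: one list of factor
    names per reviewed study or case."""
    factor_count = Counter(f for study in study_findings for f in set(study))
    by_category = Counter()
    for factor, n in factor_count.items():
        by_category[RISK_CATEGORIES.get(factor, "Other")] += n
    return factor_count, by_category
```

Sorting `factor_count` by frequency then surfaces the most critical risk factors for the subsequent allocation stage.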
11950 Geological Mapping of Gabel Humr Akarim Area, Southern Eastern Desert, Egypt: Constrain from Remote Sensing Data, Petrographic Description and Field Investigation

Authors: Doaa Hamdi, Ahmed Hashem

Abstract:

The present study aims at integrating ASTER and Landsat 8 data to discriminate and map alteration and/or mineralization zones, in addition to delineating the different lithological units of the Humr Akarim Granites area. The study area is located at 24º9' to 24º13' N and 34º1' to 34º2'45" E, covering a total exposed surface area of about 17 km². The area is characterized by rugged topography with low to moderate relief. Geologic fieldwork and petrographic investigations revealed that the basement complex of the study area is composed of metasediments, mafic dikes, older granitoids, and alkali-feldspar granites. Petrographic investigations revealed that the secondary minerals in the study area are mainly chlorite, epidote, clay minerals, and iron oxides. These minerals have specific spectral signatures in the visible near-infrared and shortwave-infrared region (0.4 to 2.5 µm), so ASTER image processing was concentrated on the VNIR-SWIR spectrometric data in order to achieve the purposes of this study (geologic mapping of hydrothermal alteration zones and delineation of possible radioactive potentialities). Mapping of hydrothermal alteration zones, in addition to discriminating the lithological units in the study area, is achieved through several image-processing techniques, including color band composites (CBC) and data transformations such as band ratios (BR), band ratio codes (BRCs), principal component analysis (PCA), the Crosta technique, and minimum noise fraction (MNF). The field verification and petrographic investigation confirm the results of the ASTER imagery and Landsat 8 data, leading to a proposed geological map (scale 1:50,000).

Keywords: remote sensing, petrography, mineralization, alteration detection

Procedia PDF Downloads 140
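The band-ratio step at the heart of alteration mapping is a per-pixel division of two co-registered bands, followed by a contrast stretch for display in a color composite. A minimal sketch; the band values, epsilon guard, and function names are illustrative, not the study's actual processing chain:

```python
def band_ratio(band_num, band_den, eps=1e-6):
    """Per-pixel ratio of two co-registered bands, as used to highlight
    clay or iron-oxide alteration minerals. Bands are 2D lists of
    reflectance values; eps guards against division by zero."""
    return [[n / (d + eps) for n, d in zip(row_n, row_d)]
            for row_n, row_d in zip(band_num, band_den)]

def stretch_to_byte(img):
    """Linear 0-255 stretch so a ratio image can serve as one channel
    of a color band composite."""
    lo = min(min(row) for row in img)
    hi = max(max(row) for row in img)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[int(round((v - lo) * scale)) for v in row] for row in img]
```

Pixels where the numerator band's absorption feature dominates show up bright in the stretched ratio image, flagging candidate alteration zones for field verification.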
11949 Robotics and Embedded Systems Applied to the Buried Pipeline Inspection

Authors: Robson C. Santos, Julio C. P. Ribeiro, Iorran M. de Castro, Luan C. F. Rodrigues, Sandro R. L. Silva, Diego M. Quesada

Abstract:

The work aims to develop a robot in the form of an autonomous vehicle to detect, inspect, and map underground pipelines using the ATmega328-based Arduino platform. The Arduino supports hardware prototyping in a language very similar to C/C++, which facilitates its use in open-source robotics and resembles the PLCs used in large industrial processes. The robot traverses the surface independently of direct human action, in order to automate the process of detecting buried pipes, guided by electromagnetic induction. The induction signal comes from coils that send it to the Arduino microcontroller, which measures the differences in signal intensity, processes the information, and then drives electrical components such as relays and motors, allowing the prototype to move across the surface and gather the necessary information. The robot was built from electrical and electronic assemblies that allowed its application to be tested. The assembly is made up of metal-detector coils, circuit boards, and a microprocessor; these previously developed, interconnected circuits together determine the process control and mechanical actions of the robot (autonomous vehicle) that detects and maps buried pipeline plates.

Keywords: robotic, metal detector, embedded system, pipeline inspection

Procedia PDF Downloads 595
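The decision logic that turns coil readings into motor actions can be sketched briefly. On the real robot this would live in the Arduino's C/C++ sketch; the Python version below, its thresholds, and its two-coil layout are illustrative assumptions, not the paper's implementation:

```python
def steer_from_coils(left_signal, right_signal, detect_threshold=400, balance=50):
    """Decide a motor action from two induction-coil readings, given as
    0-1023 ADC counts as on an Arduino analog pin. Thresholds are
    illustrative. Returns 'search', 'forward', 'turn_left', or
    'turn_right', which the main loop maps to relay/motor commands."""
    if max(left_signal, right_signal) < detect_threshold:
        return "search"       # no pipe detected: keep sweeping the area
    if abs(left_signal - right_signal) <= balance:
        return "forward"      # pipe roughly centered under the robot
    # Steer toward the coil sensing the stronger induction signal.
    return "turn_left" if left_signal > right_signal else "turn_right"
```

Logging each decision together with the robot's position is what produces the map of the buried pipeline route.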
11948 Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2d Indoor Environment with Cartographer Algorithm

Authors: Wilayat Ali, Li Sheng, Waleed Ahmed

Abstract:

The ability of a robot to simultaneously build a map of the environment and localize itself within that environment is the most important capability of mobile robots. Many algorithms can be utilized to build up the SLAM process, and SLAM remains a developing area of robotics research. The Robot Operating System (ROS) is one framework that provides multiple algorithm nodes to work with and provides a transmission layer for robots. Among the algorithms in extensive use are Hector SLAM, Gmapping, and Cartographer SLAM. This paper describes a ROS-based simultaneous localization and mapping (SLAM) library, Google Cartographer, which is an open-source algorithm. The algorithm was applied to create a map using laser and pose data from a 2D lidar placed on a mobile robot. The robot model is simulated with the Gazebo package and visualized in RViz. The primary goal of our research work is to obtain a map through the Cartographer SLAM algorithm in a static indoor environment. Our research shows that, for indoor environments, Cartographer is an applicable algorithm for generating 2D maps with a lidar placed on a mobile robot, because it uses both odometry and pose estimation. The algorithm has been evaluated, and the constructed maps are compared against those of the SLAM algorithms provided with the Turtlebot2 in the static indoor environment.

Keywords: SLAM, ROS, navigation, localization and mapping, Gazebo, RViz, Turtlebot2, SLAM algorithms, 2D indoor environment, Cartographer

Procedia PDF Downloads 127
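The core operation behind 2D lidar mapping, updating an occupancy grid from a range scan, can be sketched with a log-odds update. This is the textbook occupancy-grid formulation, a simplification of what Cartographer actually does (which adds scan matching and loop closure); the constants and pose convention are illustrative:

```python
import math

def bresenham(x0, y0, x1, y1):
    """Grid cells on the line segment from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err, x, y = dx - dy, x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dy
            y += sy
    return cells

def update_grid(log_odds, pose, ranges, angle_step, l_occ=0.9, l_free=-0.4):
    """Log-odds occupancy update from one 2D scan: cells traversed by a
    beam become more likely free, the endpoint cell more likely occupied.
    pose: (x, y, heading); ranges: beam lengths in grid cells;
    l_occ/l_free are illustrative update constants."""
    x0, y0, theta = pose
    for i, r in enumerate(ranges):
        a = theta + i * angle_step
        x1 = x0 + int(round(r * math.cos(a)))
        y1 = y0 + int(round(r * math.sin(a)))
        ray = bresenham(x0, y0, x1, y1)
        for cell in ray[:-1]:
            log_odds[cell] = log_odds.get(cell, 0.0) + l_free
        log_odds[ray[-1]] = log_odds.get(ray[-1], 0.0) + l_occ
    return log_odds
```

Thresholding the accumulated log-odds yields the familiar black/white/grey occupancy map that RViz displays.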
11947 Addressing the Gap in Health and Wellbeing Evidence for Urban Real Estate Brownfield Asset Management Social Needs and Impact Analysis Using Systems Mapping Approach

Authors: Kathy Pain, Nalumino Akakandelwa

Abstract:

The study explores the potential to fill a gap in health and wellbeing evidence for purposeful urban real estate asset management, to make investment a powerful force for societal good. Part of a five-year programme investigating the root causes of unhealthy urban development, funded by the United Kingdom Prevention Research Partnership (UKPRP), the study pilots the use of a systems mapping approach to identify drivers of and barriers to the incorporation of health and wellbeing evidence in urban brownfield asset management decision-making. Urban real estate not only provides space for economic production but also contributes to the quality of life in the local community. Yet market approaches to urban land use have, until recently, insisted that a neo-classical, technology-driven, efficient allocation of economic resources should inform acquisition, operational, and disposal decisions. Buildings in locations with declining economic performance have thus been abandoned, leading to urban decay. Property investors are now recognising the inextricable connection between sustainable urban production and quality of life in local communities. The redevelopment and operation of brownfield assets recycles existing buildings, minimising embodied carbon emissions. It also retains established urban spaces with which local communities identify, and regenerates places to create a sense of security, economic opportunity, social interaction, and quality of life. The social implications of urban real estate for health and wellbeing, and the increased adoption of benign sustainability guidance in urban production, are driving the need to consider how they affect brownfield real estate asset management decisions. Interviews with upstream real estate decision-makers conducted for the study find that local social needs and impact analysis is becoming a commercial priority for large-scale urban real estate development projects.
Evidence of the social value added by proposed developments is increasingly considered essential to secure local community support and planning permission, and to attract sustained long-term inward investment capital flows for urban projects. However, little is known about the contribution of population health and wellbeing to socially sustainable urban projects, or about the monetary value of the opportunity this presents to improve the urban environment for local communities. We report early findings from collaborations with two leading property companies managing major investments in brownfield urban assets in the UK, considering how the inclusion of health and wellbeing evidence in social valuation can inform perceptions of the social benefit of brownfield development for asset managers, local communities, public authorities, and investors, for the benefit of all parties. Using holistic case studies and systems mapping approaches, we explore the complex relationships between public health considerations and asset management decisions in urban production. Findings indicate a strong appetite in the real estate investment industry, and real potential, to include health as a vital component of sustainable real estate social value creation in asset management strategies.

Keywords: brownfield urban assets, health and wellbeing, social needs and impact, social valuation, sustainable real estate, systems mapping

Procedia PDF Downloads 45