Search results for: Fully spatial signal processing
781 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (which represent the bare earth) and non-ground points (which represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
Keywords: Airborne laser scanning, digital terrain models, filtering, forested areas.
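A minimal 1D sketch of the kind of iterative weighted spline filtering described in the abstract above. The weight function, parameters and the reduction to a single profile are illustrative assumptions, not the published filter, which works on 2.5D point clouds and selects its parameters automatically:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def filter_ground_profile(x, z, iterations=5, s_factor=1.0, sigma=0.3):
    """Iteratively fit a smoothing spline and down-weight points above it.

    x, z : 1D arrays of along-track coordinate and elevation for one ALS profile
           (x values assumed distinct). Points far above the fitted surface
           (likely canopy) receive small weights, so successive fits converge
           towards the terrain. All parameters here are illustrative.
    """
    order = np.argsort(x)
    x, z = x[order], z[order]
    w = np.ones(len(z))
    for _ in range(iterations):
        spline = UnivariateSpline(x, z, w=w, s=s_factor * len(x))
        residual = z - spline(x)
        # Asymmetric weights: keep points on/below the surface, suppress points above it.
        w = np.where(residual <= 0, 1.0, np.exp(-(residual / sigma) ** 2))
    ground_mask = (z - spline(x)) <= sigma   # points close to the final surface
    return ground_mask, spline
```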
780 Comparison of different Channel Modeling Techniques used in the BPLC Systems
Authors: Justinian Anatory, Nelson Theethayi
Abstract:
The paper compares different channel models used for modeling Broadband Power-Line Communication (BPLC) systems. The models compared are those of Zimmermann and Dostert, Philipps, Anatory et al., and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was compared in the time domain with the ATP-EMTP software, which uses a transmission line approach. It is found that for a power-line network with a minimum number of branches all the models give similar signal/pulse time responses compared with the ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the generalized TL theory approach gives results comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior on the modulation schemes. It is observed that using the Philipps model on the underground cable can predict a performance up to 25 dB better than the other channel models, which can misread the actual performance of the system. Also, the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
Keywords: Broadband power-line channel models, load impedance, branched network.
779 Air Dispersion Model for Prediction Fugitive Landfill Gaseous Emission Impact in Ambient Atmosphere
Authors: Moustafa Osman Mohammed
Abstract:
This paper explores the formation of HCl aerosol at atmospheric boundary layers and encourages the uptake of environmental modeling systems (EMSs) as a practical evaluation of gaseous emissions (“framework measures”) from small and medium-sized enterprises (SMEs). The conceptual model predicts greenhouse gas emissions to ecological points beyond landfill site operations. It focuses on incorporating traditional knowledge into baseline information for both the measurement data and the mathematical results, regarding the parameters that influence model variable inputs. The paper simplifies the parameters of aerosol processes based on the more complex aerosol process computations. The simple model can be implemented in both Gaussian and Eulerian rural dispersion models. The aerosol processes considered in this study were (i) the coagulation of particles, (ii) the condensation and evaporation of organic vapors, and (iii) dry deposition. The chemical transformation of gas-phase compounds is taken into account through a photochemical formulation, with exposure effects according to HCl concentrations as the starting point of the risk assessment. The discussion distinctly sets out the aspect of sustainability, reflecting inputs, outputs, and modes of impact on the environment. Thereby, the models incorporate abiotic and biotic species to broaden the scope of integration for both impact quantification and risk assessment. The resulting environmental obligations suggest either a recommendation or a decision on what should ultimately be achieved legislatively regarding mitigation measures for landfill gas (LFG).
Keywords: Air dispersion model, landfill management, spatial analysis, environmental impact and risk assessment.
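For reference, a minimal sketch of the classical Gaussian plume formula on which such rural dispersion screening is commonly based. This is the textbook form with illustrative Briggs-type power-law dispersion coefficients, not the authors' EMS model:

```python
import numpy as np

def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.0001, c=0.06, d=0.0015):
    """Gaussian plume concentration C(x, y, z) in g/m^3.

    Q : emission rate (g/s), u : wind speed (m/s), H : effective source height (m).
    sigma_y, sigma_z follow illustrative open-country power laws in downwind distance x (m).
    """
    sigma_y = a * x / np.sqrt(1.0 + b * x)
    sigma_z = c * x / np.sqrt(1.0 + d * x)
    coeff = Q / (2.0 * np.pi * u * sigma_y * sigma_z)
    crosswind = np.exp(-y**2 / (2.0 * sigma_y**2))
    # Reflection term accounts for the ground acting as an impermeable boundary.
    vertical = np.exp(-(z - H)**2 / (2.0 * sigma_z**2)) + np.exp(-(z + H)**2 / (2.0 * sigma_z**2))
    return coeff * crosswind * vertical

# Example: an HCl-laden landfill gas plume, 500 m downwind at ground level (hypothetical values).
print(gaussian_plume(Q=5.0, u=3.0, x=500.0, y=0.0, z=0.0, H=10.0))
```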
778 Parallel Double Splicing on Iso-Arrays
Authors: V. Masilamani, D.K. Sheena Christy, D.G. Thomas
Abstract:
Image synthesis is an important area in image processing. Various systems have been proposed in the literature to synthesize images. In this paper, we propose a bio-inspired system to synthesize images and, to study its generating power, we define the class of languages generated by the system. We refer to an image as an array in this paper. We use a primitive called an iso-array to synthesize images/arrays. The operation is double splicing on iso-arrays. The double splicing operation is used in DNA computing, and we use it here to synthesize images. A comparison of the family of languages generated by the proposed self-restricted double splicing systems on iso-arrays with the existing family of local iso-picture languages is made. Certain closure properties such as union, concatenation and rotation are studied for the family of languages generated by the proposed model.
Keywords: DNA computing, splicing system, iso-picture languages, iso-array double splicing system, iso-array self splicing.
777 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images
Authors: A. Nachour, L. Ouzizi, Y. Aoura
Abstract:
Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated using several examples of different synthetic and medical images. The obtained experimental results confirm the efficiency and accuracy of the detected edges.
Keywords: Edge detection, medical MR images, multi-agent systems, vector field convolution.
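A toy sketch of the agent idea from the abstract above, with gradient magnitude used in place of the Vector Field Convolution environment and with agent cooperation omitted; everything here (agent counts, thresholds, movement rule) is an illustrative assumption:

```python
import numpy as np
from scipy.ndimage import sobel

def agent_edge_map(image, n_agents=500, steps=200, threshold=0.3, seed=0):
    """Let simple agents hill-climb the gradient magnitude and mark edge pixels they reach."""
    img = image.astype(float)
    grad = np.hypot(sobel(img, axis=1), sobel(img, axis=0))
    grad /= grad.max() + 1e-12
    rng = np.random.default_rng(seed)
    h, w = grad.shape
    agents = np.column_stack([rng.integers(1, h - 1, n_agents), rng.integers(1, w - 1, n_agents)])
    edges = np.zeros_like(grad, dtype=bool)
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for _ in range(steps):
        for a in agents:
            r, c = int(a[0]), int(a[1])
            # Move to the 8-neighbour with the largest gradient magnitude (greedy hill climb).
            best = max(moves, key=lambda m: grad[r + m[0], c + m[1]])
            a[0] = min(max(r + best[0], 1), h - 2)
            a[1] = min(max(c + best[1], 1), w - 2)
            if grad[a[0], a[1]] > threshold:
                edges[a[0], a[1]] = True
    return edges
```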
776 Tree-on-DAG for Data Aggregation in Sensor Networks
Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik
Abstract:
Computing and maintaining network structures for efficient data aggregation incurs high overhead for dynamic events where the set of nodes sensing an event changes with time. Moreover, structured approaches are sensitive to the waiting time that nodes use to wait for packets from their children before forwarding the packet to the sink. An optimal routing and data aggregation scheme for wireless sensor networks is proposed in this paper. We propose Tree on DAG (ToD), a semistructured approach that uses dynamic forwarding on an implicitly constructed structure composed of multiple shortest path trees to support network scalability. The key principle behind ToD is that adjacent nodes in a graph will have low stretch in one of these trees, thus resulting in early aggregation of packets. Based on simulations of a 2,000-node Mica2-based network, we conclude that efficient aggregation in large-scale networks can be achieved by our semistructured approach.
Keywords: Aggregation, packet merging, query processing.
775 Social Interaction Dynamics Exploration: The Case Study of El Sherouk City
Authors: Nardine El Bardisy, Wolf Reuter, Ayat Ismail
Abstract:
In Egypt, there is continuous housing demand as a result of rapid population growth. In 1979, this forced the government to establish new urban communities in order to decrease stress around the Delta. The New Urban Communities Authority (NUCA) was formed to take responsibility for this new policy. These communities suffer from a deficient social life due to their typology, that of separated islands with barriers. The typology of the new urban communities results from the influence of the neoliberal movement and modern city planning forms. The present lack of social interaction in these communities should be remedied in the future. From a global perspective, sustainable development calls for creating more sustainable communities that include social, economic and environmental aspects. From 1960, planners focused heavily on the promotion of the social dimension in urban development plans. The research hypothesis states: “It is possible to promote social interaction in new urban communities through a set of socio-spatial recommended strategies that are tailored for the Greater Cairo Region context”. In order to test this hypothesis, the case of El-Sherouk city is selected, which represents the typical NUCA development plans. Social interaction indicators were derived from the literature and used to explore different social dynamics in the selected case. The tools used for exploring the case study are online questionnaires, face-to-face questionnaires, interviews, and observations. These investigations were analyzed, and conclusions and recommendations were set out to improve social interaction.
Keywords: New urban communities, modern planning, social interaction, social life.
774 Design of an Intelligent Location Identification Scheme Based On LANDMARC and BPNs
Authors: S. Chaisit, H.Y. Kung, N.T. Phuong
Abstract:
Radio frequency identification (RFID) applications have grown rapidly in many industries, especially in indoor location identification. The advantage of using received signal strength indicator (RSSI) values as an indoor location measurement method is that it is a cost-effective approach that requires no extra hardware. Because the accuracy of many positioning schemes using RSSI values is limited by interference factors and the environment, it is challenging to use RFID location techniques based on an integrated positioning algorithm design. This study proposes a location estimation approach and analyzes a scheme relying on RSSI values to minimize location errors. In addition, this paper examines different factors that affect location accuracy by integrating the backpropagation neural network (BPN) with the LANDMARC algorithm in a training phase and an online phase. First, the training phase computes coordinates obtained from the LANDMARC algorithm, which uses RSSI values and the real coordinates of reference tags as training data for constructing an appropriate BPN architecture and training length. Second, in the online phase, the LANDMARC algorithm calculates the coordinates of tracking tags, which are then used as BPN inputs to obtain location estimates. The results show that the proposed scheme can estimate locations more accurately than LANDMARC without extra devices.
Keywords: BPNs, indoor location, location estimation, intelligent location identification.
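A minimal sketch of the LANDMARC k-nearest-neighbour step that feeds the BPN in the scheme above. Array shapes and the choice of k are illustrative, and the BPN refinement stage is omitted:

```python
import numpy as np

def landmarc_estimate(theta, S, ref_xy, k=4):
    """Estimate a tracking tag position from RSSI values (LANDMARC weighted k-NN).

    theta  : (n_readers,) RSSI vector of the tracking tag.
    S      : (n_refs, n_readers) RSSI matrix of the reference tags.
    ref_xy : (n_refs, 2) known coordinates of the reference tags.
    """
    E = np.linalg.norm(S - theta, axis=1)          # signal-space distance to each reference tag
    nearest = np.argsort(E)[:k]                    # k nearest reference tags
    w = 1.0 / (E[nearest] ** 2 + 1e-12)            # LANDMARC weights ~ 1 / E_i^2
    w /= w.sum()
    return w @ ref_xy[nearest]                     # weighted centroid (x, y)
```

In the proposed scheme, these LANDMARC estimates would then be fed to the BPN as inputs for refinement.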
773 Sub-Image Detection Using Fast Neural Processors and Image Decomposition
Authors: Hazem M. El-Bakry, Qiangfu Zhao
Abstract:
In this paper, an approach to reduce the computation steps required by fast neural networks for the searching process is presented. The principle of the divide and conquer strategy is applied through image decomposition. Each image is divided into small sub-images and then each one is tested separately using a fast neural network. The operation of fast neural networks is based on applying cross correlation in the frequency domain between the input image and the weights of the hidden neurons. Compared to conventional and fast neural networks, experimental results show that a speed-up ratio is achieved when applying this technique to locate human faces automatically in cluttered scenes. Furthermore, faster face detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time using the same number of fast neural networks. In contrast to using only fast neural networks, the speed-up ratio increases with the size of the input image when using fast neural networks and image decomposition.
Keywords: Fast Neural Networks, 2D-FFT, Cross Correlation, Image Decomposition, Parallel Processing.
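A minimal sketch of the core idea, cross-correlating a neuron's weight template with each sub-image in the frequency domain. Block size and the peak-response criterion are illustrative, and the parallel dispatch and the full fast-neural-network normalisation are not reproduced:

```python
import numpy as np

def freq_cross_correlation(sub_image, weights):
    """Cross-correlate a sub-image with a hidden-neuron weight template via 2D FFT."""
    F_img = np.fft.fft2(sub_image)
    F_w = np.fft.fft2(weights, s=sub_image.shape)        # zero-pad template to sub-image size
    return np.real(np.fft.ifft2(F_img * np.conj(F_w)))   # correlation theorem

def search_sub_images(image, weights, block=64):
    """Divide the image into blocks and test each block separately."""
    detections = []
    for r in range(0, image.shape[0] - block + 1, block):
        for c in range(0, image.shape[1] - block + 1, block):
            corr = freq_cross_correlation(image[r:r + block, c:c + block], weights)
            detections.append(((r, c), corr.max()))       # peak response per block
    return detections
```

Each block could equally be handed to a separate worker, which is the parallel-processing variant the abstract describes.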
772 Wireless Distributed Load-Shedding Management System for Non-Emergency Cases
Authors: Taha Landolsi, A. R. Al-Ali, Tarik Ozkul, Mohammad A. Al-Rousan
Abstract:
In this paper, we present a cost-effective wireless distributed load shedding system for non-emergency scenarios. In power transformer locations where a SCADA system cannot be used, the proposed solution provides a reasonable alternative that combines the use of microcontrollers and the existing GSM infrastructure to send early warning SMS messages to users, advising them to proactively reduce their power consumption before system capacity is reached and a systematic power shutdown takes place. A novel communication protocol and message set have been devised to handle the messaging between the transformer sites, where the microcontrollers are located and the measurements take place, and the central processing site where the database server is hosted. Moreover, the system sends warning messages to the end-users' mobile devices, which are used as communication terminals. The system has been implemented and tested, and different experimental results are presented.
Keywords: Smart grid, load shedding, demand side management, GSM wireless networks, SCADA systems.
771 Laboratory Scale Extraction of Sugar Cane using High Electric Field Pulses
Authors: M. N. Eshtiaghi, N. Yoswathana
Abstract:
The aim of this study was to extract sugar from sugarcane using high electric field pulses (HELP) as a non-thermal cell permeabilization method. The results of this study showed that it is possible to permeabilize sugar cane cells using HELP within very short times (less than 10 s) and at room temperature. Increasing the field strength (from 0.5 kV/cm to 2 kV/cm) and the pulse number (1 to 12) increased the permeabilization of the sugar cane cells. The energy consumption during HELP treatment of sugar cane (2.4 kJ/kg) was about 100 times less than that of thermal cell disintegration at 85 °C (about 271.7 kJ/kg). In addition, it was possible to extract sugar cane at a moderate temperature (45 °C) using HELP pretreatment. With a combination of HELP pretreatment followed by thermal extraction at 75 °C, extraction yielded up to 3% more sugar (on the basis of total extractable sugar) compared to samples without HELP pretreatment.
Keywords: Cell permeabilization, high electric field pulses, non-thermal processing, sugar cane extraction.
770 Analytical and Finite Element Analysis of Hydroforming Deep Drawing Process
Authors: Maziar Ramezani, Thomas Neitzert
Abstract:
This paper gives an overview of a deep drawing process using a pressurized liquid medium separated from the sheet by a rubber diaphragm. Hydroforming deep drawing processing of sheet metal parts provides a number of advantages over conventional techniques. It generally increases the depth-to-diameter ratio possible in cup drawing and minimizes the thickness variation of the drawn cup. To explore the deformation mechanism, analytical and numerical simulations are used to analyze the drawing process of an AA6061-T4 blank. The effects of key process parameters such as the coefficient of friction, the initial thickness of the blank and the radius between the cup wall and the flange are investigated analytically and numerically. The simulated results were in good agreement with the results of the analytical model. According to the finite element simulations, the hydroforming deep drawing method provides a more uniform thickness distribution compared to conventional deep drawing and decreases the risk of tearing during the process.
Keywords: Deep drawing, hydroforming, rubber diaphragm.
769 Fabrication and Characterization of Gelatin Nanofibers Dissolved in Concentrated Acetic Acid
Authors: Kooshina Koosha, Sima Habibi, Azam Talebian
Abstract:
Electrospinning is a simple, versatile and widely accepted technique to produce ultra-fine fibers ranging from nanometers to microns. Recently, there has been great interest in developing this technique to produce nanofibers with novel properties and functionalities. The electrospinning field is extremely broad, and consequently there have been many useful reviews discussing various aspects, from the detailed fiber formation mechanism to the formation of nanofibers and to discussion of a wide range of applications. The focus of this study, on the other hand, is quite narrow, highlighting electrospinning parameters. This work briefly covers the solution and processing parameters (for instance, concentration, solvent type, voltage, flow rate, and the distance between the collector and the tip of the needle) impacting the morphological characteristics of nanofibers, such as diameter. In this paper, comprehensive work is presented on producing nanofibers from the natural polymer gelatin.
Keywords: Electrospinning, solution parameters, process parameters, natural fiber.
768 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: 'Reddit'
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native Language Identification is one of the growing subfields in Natural Language Processing (NLP). The task of Native Language Identification (NLI) is mainly concerned with predicting the native language of an author's writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using social media data from Reddit). In this NLI task, the predefined models are trained on one corpus (TOEFL) and then the trained models are evaluated on different data from an external corpus (Reddit). Three classifiers are used in this task: the baseline, linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones when tested both within corpus and across corpora.
Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML.
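A minimal sketch of the cross-corpus setup described above: train on one corpus, evaluate on another. The loading helpers `load_toefl()` and `load_reddit()` are hypothetical, and the character n-gram features and classifier settings are illustrative, not the authors' exact configuration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

# Hypothetical helpers returning (texts, native_language_labels) for each corpus.
train_texts, train_labels = load_toefl()
test_texts, test_labels = load_reddit()

# Content-independent-style features: character n-grams rather than topic words.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4), min_df=5),
    LinearSVC(),
)
model.fit(train_texts, train_labels)
print("cross-corpus accuracy:", accuracy_score(test_labels, model.predict(test_texts)))
```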
767 Impact of Vehicle Travel Characteristics on Level of Service: A Comparative Analysis of Rural and Urban Freeways
Authors: Anwaar Ahmed, Muhammad Bilal Khurshid, Samuel Labi
Abstract:
The effect of trucks on the level of service is determined by considering passenger car equivalents (PCE) of trucks. The current version of the Highway Capacity Manual (HCM) uses a single PCE value for all trucks combined. However, the composition of truck traffic varies from location to location; therefore, a single PCE value for all trucks may not correctly represent the impact of truck traffic at specific locations. Consequently, the present study developed separate PCE values for single-unit and combination trucks to replace the single value provided in the HCM on different freeways. Site-specific PCE values were developed using the concept of spatial lagging headways (that is, the distance between the rear bumpers of two vehicles in a traffic stream) measured from field traffic data. The study used data from four locations on a single urban freeway and three different rural freeways in Indiana. Three-stage least squares (3SLS) regression techniques were used to generate models that predicted lagging headways for passenger cars, single-unit trucks (SUT), and combination trucks (CT). The estimated PCE values for single-unit and combination trucks on basic urban freeways (level terrain) were 1.35 and 1.60, respectively. For rural freeways the estimated PCE values for single-unit and combination trucks were 1.30 and 1.45, respectively. As expected, traffic variables such as vehicle flow rates and speed have significant impacts on vehicle headways. The study results revealed that the use of separate PCE values for different truck classes can have a significant influence on the LOS estimation.
Keywords: Level of Service, Capacity Analysis, Lagging Headway.
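As a simplified illustration of how lagging headways translate into PCE values: the study itself estimates headways with 3SLS regression models, whereas the direct ratio below is only a sketch with hypothetical field measurements:

```python
import numpy as np

def pce_from_headways(truck_headways_m, car_headways_m):
    """Approximate a passenger car equivalent as the ratio of mean spatial lagging headways."""
    return np.mean(truck_headways_m) / np.mean(car_headways_m)

# Hypothetical field measurements (metres between rear bumpers).
cars = np.array([28.0, 31.5, 26.8, 30.2])
single_unit_trucks = np.array([38.5, 41.0, 36.9])
print("PCE (single-unit truck):", round(pce_from_headways(single_unit_trucks, cars), 2))
```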
766 Analyzing the Impact of Spatio-Temporal Climate Variations on the Rice Crop Calendar in Pakistan
Authors: Muhammad Imran, Iqra Basit, Mobushir Riaz Khan, Sajid Rasheed Ahmad
Abstract:
The present study investigates the space-time impact of climate change on the rice crop calendar in tropical Gujranwala, Pakistan. The climate change impact was quantified through the climatic variables, whereas the existing calendar of the rice crop was compared with the phenological stages of the crop, depicted through the time series of the Normalized Difference Vegetation Index (NDVI) derived from Landsat data for the decade 2005-2015. Local maxima were applied to the NDVI time series to compute the rice phenological stages. Panel models with fixed and cross-section fixed effects were used to establish the relation between the climatic parameters and the NDVI time series across villages and across rice growing periods. Results show that the climatic parameters have a significant impact on the rice crop calendar. Moreover, the fixed effect model is a significant improvement over the cross-sectional fixed effect models (R-squared equal to 0.673 vs. 0.0338). We conclude that high inter-annual variability of climatic variables causes high variability of NDVI, and thus a shift in the rice crop calendar. Moreover, the inter-annual (temporal) variability of the rice crop calendar is high compared to the inter-village (spatial) variability. We suggest that local rice farmers adapt to this change in the rice crop calendar.
Keywords: Landsat NDVI, panel models, temperature, rainfall.
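A minimal sketch of extracting a peak-growth stage as a local maximum of an NDVI time series, as described above. The NDVI values and peak-detection parameters are illustrative; the study applies this per village over Landsat-derived series for 2005-2015:

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical NDVI time series for one village (one value per Landsat acquisition).
ndvi = np.array([0.21, 0.25, 0.32, 0.48, 0.63, 0.71, 0.66, 0.52, 0.38, 0.27])

# Local maxima mark the peak of the rice growth cycle; prominence filters out noise.
peaks, _ = find_peaks(ndvi, prominence=0.1)
print("peak-growth acquisition indices:", peaks, "NDVI at peaks:", ndvi[peaks])
```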
765 Behavioral Signature Generation using Shadow Honeypot
Authors: Maros Barabas, Michal Drozd, Petr Hanacek
Abstract:
A novel behavioral detection framework is proposed to detect zero-day buffer overflow vulnerabilities (based on network behavioral signatures) using zero-day exploits, instead of the signature-based or anomaly-based detection solutions currently available for IDPS techniques. First, we present the detection model, which uses a shadow honeypot. Our system is used for the online processing of network attacks and the generation of a behavior detection profile. The detection profile represents a dataset of 112 types of metrics describing the exact behavior of malware in the network. In this paper we present examples of generating behavioral signatures for two attacks: a buffer overflow exploit on an FTP server and the well-known Conficker worm. We demonstrate the visualization of important aspects by showing the differences between valid behavior and the attacks. Based on these metrics we can detect attacks with a very high probability of success; the process of detection is, however, very expensive.
Keywords: Behavioral signatures, metrics, network, security design.
764 Design and Implementation of Shared Memory based Parallel File System Logging Method for High Performance Computing
Authors: Hyeyoung Cho, Sungho Kim, SangDong Lee
Abstract:
The I/O workload is a critical and important factor in analyzing I/O patterns and file system performance. However, tracing I/O operations on the fly in a distributed parallel file system is non-trivial due to the collection overhead and the large volume of data. In this paper, we design and implement a parallel file system logging method for high performance computing using a shared memory-based multi-layer scheme. It minimizes the overhead with a reduced logging operation response time and provides an efficient post-processing scheme through shared memory. A separate logging server can collect sequential logs from multiple clients in a cluster through packet communication. The implementation and evaluation results show the low overhead and high scalability of this architecture for high performance parallel logging analysis.
Keywords: I/O workload, PVFS, I/O trace.
763 Cybersecurity for Digital Twins in the Built Environment: Research Landscape, Industry Attitudes and Future Direction
Authors: Kaznah Alshammari, Thomas Beach, Yacine Rezgui
Abstract:
Technological advances in the construction sector are helping to make smart cities a reality by means of Cyber-Physical Systems (CPS). CPS integrate information and the physical world through the use of Information Communication Technologies (ICT). An increasingly common goal in the built environment is to integrate Building Information Models (BIM) with the Internet of Things (IoT) and sensor technologies using CPS. Future advances could see the adoption of digital twins, creating new opportunities for CPS using monitoring, simulation and optimisation technologies. However, researchers often fail to fully consider the security implications. To date, it is not widely possible to assimilate BIM data and cybersecurity concepts and, therefore, security has thus far been overlooked. This paper reviews the empirical literature concerning IoT applications in the built environment and discusses real-world applications of the IoT intended to enhance construction practices, people's lives and cybersecurity. Specifically, this research addresses two research questions: (a) How suitable are the current IoT and CPS security stacks for addressing the cybersecurity threats facing digital twins in the context of smart buildings and districts? and (b) What are the current obstacles to tackling cybersecurity threats to built environment CPS? To answer these questions, this paper reviews the current state-of-the-art research concerning digital twins in the built environment, the IoT, BIM, urban cities and cybersecurity. The findings of this study confirmed the importance of using digital twins in both the IoT and BIM. Also, eight reference zones across Europe have gained special recognition for their contributions to the advancement of IoT science. Therefore, this paper evaluates the use of digital twins in CPS to arrive at recommendations for expanding BIM specifications to facilitate IoT compliance, bolster cybersecurity and integrate digital twin and city standards in the smart cities of the future.
Keywords: BIM, cybersecurity, digital twins, IoT, urban cities.
762 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images
Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar
Abstract:
Diabetic Retinopathy (DR) is a severe retinal disease which is caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. Also, a Support Vector Machine (SVM) classifier is used to classify retinal images as normal or abnormal cases, the latter including non-proliferative or proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method is able to detect DR reliably. The sensitivity, specificity and accuracy of this approach are 90%, 87.5%, and 91.4%, respectively.
Keywords: Diabetic retinopathy, fundus images, STARE, Gabor filter, SVM.
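A minimal sketch of a preprocessing-plus-classification chain of the kind described above (CLAHE, a Gabor response, and an SVM). The feature set and all parameters are illustrative assumptions, and the Circular Hough Transform and thresholding stages are omitted:

```python
import numpy as np
from skimage.exposure import equalize_adapthist
from skimage.filters import gabor
from sklearn.svm import SVC

def fundus_features(image):
    """Compute a tiny illustrative feature vector from a grayscale fundus image in [0, 1]."""
    enhanced = equalize_adapthist(image, clip_limit=0.03)   # CLAHE contrast enhancement
    real, imag = gabor(enhanced, frequency=0.2)             # Gabor response highlights vessels/lesions
    magnitude = np.hypot(real, imag)
    return np.array([magnitude.mean(), magnitude.std(), enhanced.mean(), enhanced.std()])

def train_dr_classifier(train_images, train_labels):
    """train_images / train_labels would come from a labelled set such as STARE (hypothetical here)."""
    X = np.vstack([fundus_features(img) for img in train_images])
    clf = SVC(kernel="rbf")
    return clf.fit(X, train_labels)   # labels: normal / non-proliferative / proliferative
```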
761 A Nonoblivious Image Watermarking System Based on Singular Value Decomposition and Texture Segmentation
Authors: Soroosh Rezazadeh, Mehran Yazdi
Abstract:
In this paper, a robust digital image watermarking scheme for copyright protection applications using the singular value decomposition (SVD) is proposed. In this scheme, an entropy masking model has been applied to the host image for texture segmentation. Moreover, the local luminance and textures of the host image are considered in the watermark embedding procedure to increase the robustness of the watermarking scheme. In contrast to existing SVD-based watermarking systems, which have been designed to embed visual watermarks, our system uses a pseudo-random sequence as a watermark. We have tested the performance of our method using a wide variety of image processing attacks on different test images. A comparison is made between the results of our proposed algorithm and those of a wavelet-based method to demonstrate the superior performance of our algorithm.
Keywords: Watermarking, copyright protection, singular value decomposition, entropy masking, texture segmentation.
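A minimal sketch of embedding a pseudo-random watermark by perturbing block-wise singular values. The block size and embedding strength are illustrative, and the entropy-masking and texture-segmentation steps that adapt the strength locally in the proposed scheme are omitted:

```python
import numpy as np

def embed_svd_watermark(image, alpha=2.0, block=8, seed=0):
    """Embed a pseudo-random sequence into the singular values of 8x8 blocks of a grayscale image."""
    rng = np.random.default_rng(seed)
    out = image.astype(float).copy()
    h = (image.shape[0] // block) * block
    w = (image.shape[1] // block) * block
    for r in range(0, h, block):
        for c in range(0, w, block):
            U, S, Vt = np.linalg.svd(out[r:r + block, c:c + block], full_matrices=False)
            S_marked = S + alpha * rng.standard_normal(S.shape)   # pseudo-random watermark sequence
            out[r:r + block, c:c + block] = U @ np.diag(S_marked) @ Vt
    return out
```

Detection would regenerate the same pseudo-random sequence from the seed and correlate it with the singular-value differences, which is why the seed acts as the secret key in such schemes.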
760 Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing
Authors: Vahid Khorramshahi, Alireza Behrad, Neeraj K. Kanhere
Abstract:
In this paper we present a new method for over-height vehicle detection in low headroom streets and highways using digital video processing. The accuracy, the lower price compared to existing detectors such as laser radars, and the capability of providing extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm the features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied using background estimation and subtraction. Then the world coordinates of the features that are inside the blobs are estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria and grouped using an undirected graph. Then they are tracked through sequential frames. The obtained groups correspond to over-height vehicles in the scene.
Keywords: Feature extraction, over-height vehicle detection, traffic monitoring, vehicle tracking.
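A heavily simplified sketch of such a detection pipeline, using background subtraction and a pixel-height threshold only; the KLT feature tracking, camera calibration and metric height estimation of the method above are not reproduced, and all thresholds are illustrative:

```python
import cv2

def overheight_alarm(video_path, pixel_height_limit=180, min_area=2500):
    """Flag frames containing a foreground blob taller than a calibrated pixel limit."""
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25, detectShadows=False)
    alarms = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = bg.apply(frame)                                  # foreground (moving vehicle) mask
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove small noise blobs
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for cnt in contours:
            x, y, w, h = cv2.boundingRect(cnt)
            if w * h >= min_area and h > pixel_height_limit:
                alarms.append(frame_idx)                        # candidate over-height vehicle
                break
        frame_idx += 1
    cap.release()
    return alarms
```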
759 Power Allocation in User-Centric Cell-Free Massive MIMO Systems with Limited Fronthaul Capacity
Authors: Siminfar Samakoush Galougah
Abstract:
In this paper, we study two power allocation problems for an uplink user-centric (UC) cell-free massive multiple-input multiple-output (CF-mMIMO) system. We assume each access point (AP) is connected to a central processing unit (CPU) via a fronthaul link with limited capacity. To use the fronthaul capacity efficiently, two strategies for transmitting signals from the APs to the CPU are employed, namely compress-forward-estimate (CFE) and estimate-compress-forward (ECF). The capacities of the aforementioned strategies in user-centric CF-mMIMO are derived. Then, we solve the two power allocation problems, with minimum spectral efficiency (SE) maximization and sum-SE maximization objectives, for the ECF and CFE strategies.
Keywords: Cell-free massive MIMO, limited capacity fronthaul, spectral efficiency, power allocation problem.
758 An Experimental Comparison of Unsupervised Learning Techniques for Face Recognition
Authors: Dinesh Kumar, C.S. Rai, Shakti Kumar
Abstract:
Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its various potential applications such as security systems, entertainment, criminal identification, etc. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM) and Independent Component Analysis (ICA) are three of the many unsupervised techniques proposed by different researchers for face recognition. This paper proposes an integration of two of these techniques, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that, although the individual techniques SOM and PCA give excellent performance on their own, the combination of the two can also be utilized for face recognition. Experimental results also indicate that, for the given face database and the classifier used, SOM performs better than the other unsupervised learning techniques. A comparison of two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computational time.
Keywords: Face recognition, Principal Component Analysis, Self-Organizing Maps, Independent Component Analysis.
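A minimal sketch of a PCA-then-SOM combination of the kind compared above, using a tiny NumPy SOM. The grid size, learning schedule, number of components and the hypothetical `faces` array are illustrative assumptions, not the paper's architecture or database:

```python
import numpy as np
from sklearn.decomposition import PCA

def train_som(X, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on PCA-reduced face vectors."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((grid[0] * grid[1], X.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float)
    n_steps = epochs * len(X)
    step = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            lr = lr0 * (1.0 - step / n_steps)                      # decaying learning rate
            sigma = sigma0 * (1.0 - step / n_steps) + 0.5          # shrinking neighbourhood
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))            # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))                   # Gaussian neighbourhood function
            W += lr * h[:, None] * (x - W)
            step += 1
    return W

def pca_som_pipeline(faces, n_components=40):
    """faces: hypothetical (n_samples, n_pixels) array of flattened face images."""
    reduced = PCA(n_components=n_components).fit_transform(faces)  # PCA for dimensionality reduction
    return train_som(reduced)                                      # SOM clusters the reduced face space
```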
757 A BIM-Based Approach to Assess COVID-19 Risk Management Regarding Indoor Air Ventilation and Pedestrian Dynamics
Authors: T. Delval, C. Sauvage, Q. Jullien, R. Viano, T. Diallo, B. Collignan, G. Picinbono
Abstract:
In the context of the international spread of COVID-19, the Centre Scientifique et Technique du Bâtiment (CSTB) has led joint research with the French Hauts-de-Seine departmental authorities to analyse the risk in school spaces according to their configuration, ventilation system and spatial segmentation strategy. This paper describes the main results of this joint research. A multidisciplinary team involving experts in indoor air quality/ventilation, pedestrian movement and IT domains was established to develop a COVID risk analysis tool based on Building Information Models. The work started with a specific analysis of two pilot schools in order to provide the local administration with specifications to minimize the spread of the virus. Different recommendations were published to optimize/validate the use of ventilation systems and the strategy of student occupancy and student flow segmentation within the building. This COVID expertise has been digitized in order to support a quick risk analysis of an entire building that can be used by the public administration through an easy user interface implemented in free BIM management software. One of the most interesting results is the ability to dynamically compare different ventilation system scenarios and space occupation strategies inside the BIM model. This concurrent engineering approach provides users with the optimal solution according to both ventilation and pedestrian flow expertise.
Keywords: BIM, knowledge management, system expert, risk management, indoor ventilation, pedestrian movement, integrated design.
756 Thermoelectric Properties of Doped Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
The transport properties of carriers in polycrystalline silicon films affect the performance of polycrystalline silicon-based devices. They depend strongly on the grain structure, grain boundary trap properties and doping concentration, which in turn are determined by the film deposition and processing conditions. Based on the properties of charge carriers, phonons, grain boundaries and their interactions, the thermoelectric properties of polycrystalline silicon are analyzed with the relaxation time approximation of the Boltzmann transport equation. With this approach, the thermal conductivity, electrical conductivity and Seebeck coefficient can be determined as functions of grain size, trap properties and doping concentration. An experiment on heavily doped polycrystalline silicon is carried out and the measurement results are compared with the model.
Keywords: Conductivity, polycrystalline silicon, relaxation time approximation, Seebeck coefficient, thermoelectric property.
755 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration
Authors: A. Ghodbane, M. Saad, J.-F. Boland, C. Thibeault
Abstract:
Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique was generally expensive and time-consuming, and it involved increased weight and space in the system. Therefore, nowadays, the on-line fault diagnosis of actuators and fault accommodation play a major role in the design of avionic systems. These approaches, known as Fault Tolerant Flight Control systems (FTFCs), are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the geometric approach and a Reconfigurable Flight Control (RFC) is presented. The geometric approach is used for cosmic ray fault reconstruction, while Sliding Mode Control (SMC) based on Lyapunov stability theory is designed for the reconfiguration of the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against faulty actuator signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a non-linear 6 DOF aircraft model.
Keywords: Actuators’ faults, Fault detection and diagnosis, Fault tolerant flight control, Sliding mode control, Geometric approach for fault reconstruction, Lyapunov stability.
754 Central Finite Volume Methods Applied in Relativistic Magnetohydrodynamics: Applications in Disks and Jets
Authors: Raphael de Oliveira Garcia, Samuel Rocha de Oliveira
Abstract:
We have developed a new computer program in Fortran 90 in order to obtain numerical solutions of a system of relativistic magnetohydrodynamics partial differential equations with predetermined gravitation (GRMHD), capable of simulating the formation of relativistic jets from the accretion disk of matter up to their ejection. Initially we carried out a study of one-dimensional finite volume numerical methods, namely the Lax-Friedrichs, Lax-Wendroff and Nessyahu-Tadmor methods and Godunov methods dependent on Riemann problems, applied to the Euler equations, in order to verify their main features and make comparisons among those methods. We then implemented the central finite volume method of Nessyahu-Tadmor, a numerical scheme whose formulation is free of Riemann problem solvers and of dimensional splitting, even in two or more spatial dimensions, at this point applied to the GRMHD equations. Finally, with the Nessyahu-Tadmor method it was possible to obtain stable numerical solutions - without spurious oscillations or excessive dissipation - of a magnetized accretion disk rotating around a central Schwarzschild black hole (BH) immersed in a magnetosphere, with matter ejected in the form of a jet over a distance of fourteen times the radius of the BH, a record for astrophysical simulations of this kind. In our simulations we also obtained jet substructures. A great advantage is that, with our code, we can simulate the GRMHD equations on a simple personal computer.
Keywords: Finite Volume Methods, Central Schemes, Fortran 90, Relativistic Astrophysics, Jet.
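As a small illustration of the 1D finite volume schemes compared in the study above, a Lax-Friedrichs solver for the inviscid Burgers equation, a toy scalar problem rather than the GRMHD system or the Nessyahu-Tadmor staggered scheme itself; grid size and final time are illustrative:

```python
import numpy as np

def lax_friedrichs_burgers(u0, dx, t_final, cfl=0.5):
    """Evolve u_t + (u^2/2)_x = 0 with the Lax-Friedrichs scheme on a periodic domain."""
    u = u0.astype(float).copy()
    t = 0.0
    while t < t_final:
        dt = min(cfl * dx / max(np.abs(u).max(), 1e-12), t_final - t)   # CFL-limited time step
        f = 0.5 * u ** 2                                                # Burgers flux
        # u_j^{n+1} = (u_{j-1} + u_{j+1})/2 - dt/(2 dx) * (f_{j+1} - f_{j-1})
        u = 0.5 * (np.roll(u, 1) + np.roll(u, -1)) - dt / (2.0 * dx) * (np.roll(f, -1) - np.roll(f, 1))
        t += dt
    return u

# Example: a smooth sine wave steepening into a shock on a periodic domain.
x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
print(lax_friedrichs_burgers(np.sin(x), x[1] - x[0], t_final=1.0)[:5])
```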
753 JConqurr - A Multi-Core Programming Toolkit for Java
Authors: G.A.C.P. Ganegoda, D.M.A. Samaranayake, L.S. Bandara, K.A.D.N.K. Wimalawarne
Abstract:
With the popularity of multi-core and many-core architectures, there is a great demand for software frameworks which can support parallel programming methodologies. In this paper we introduce an Eclipse toolkit, JConqurr, which is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java which is capable of providing support for common parallel programming patterns, including task, data, divide and conquer and pipeline parallelism. The toolkit uses an annotation and a directive mechanism to convert sequential code into parallel code. In addition, we have proposed a novel mechanism to achieve parallelism using graphics processing units (GPUs). Experiments with common parallelizable algorithms have shown that our toolkit can be easily and efficiently used to convert sequential code to parallel code, and significant performance gains can be achieved.
Keywords: Multi-core, parallel programming patterns, GPU, Java, Eclipse plugin, toolkit.
752 Stress Analysis of Hexagonal Element for Precast Concrete Pavements
Authors: J. Novak, A. Kohoutkova, V. Kristek, J. Vodicka, M. Sramek
Abstract:
While the use of cast-in-place concrete for airfield and highway pavement overlays is very common, the application of precast concrete elements is very limited today. The main reasons are high production costs and complex structural behavior. Despite that, several precast concrete systems have been developed and tested with the aim of providing a system with rapid construction. This contribution deals with the reinforcement design of a hexagonal element developed for a proposed airfield pavement system. The sub-base course of the system is composed of compacted recycled concrete aggregates, with fiber-reinforced concrete with recycled aggregates placed on top of it. The selected element belongs to a group of precast concrete elements which are being considered for the construction of a surface course. Both the high cost of full-scale experiments and the need to investigate various elements make it necessary to simulate their behavior in numerical analysis software using the finite element method instead of performing expensive experiments. The simulation of the selected element was conducted on a nonlinear model in order to obtain results which could fully substitute for results from experiments. The main objective was to design the reinforcement of the precast concrete element subject to quasi-static loading from airplanes with respect to geometrical imperfections, manufacturing imperfections, tensile stress in the reinforcement, compressive stress in the concrete and crack width. The obtained findings demonstrate that the presence and position of imperfections in a pavement highly affect the stress distribution in the precast concrete element. The precast concrete element should be heavily reinforced to fulfill all the demands. Using under-reinforced concrete elements would lead to the formation of wide, permanently open cracks.
Keywords: Imperfection, numerical simulation, pavement, precast concrete element, reinforcement design, stress analysis.