Search results for: a priori algorithm
1519 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques
Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo
Abstract:
Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to the fast economic growth over the last ten years. Bogotá has been affected by high pollution events which led to high concentrations of PM10 and NO2, exceeding the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their concentrations in the atmosphere depend on the local meteorological factors. Therefore, it is necessary to establish a relationship between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2 and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá, using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network within the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain primary relations between all the parameters, and afterwards, the K-means clustering technique was implemented to corroborate those relations found previously and to find patterns in the data. PCA was also used on a per-shift basis (morning, afternoon, night and early morning) to validate possible variation of the previous trends, and on a per-year basis to verify that the identified trends have remained throughout the study time. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the most influencing factors on PM10 concentrations. Furthermore, it was confirmed that high humidity episodes increased PM2.5 levels. It was also found that there are directly proportional relationships between O3 levels and wind speed and radiation, while there is an inverse relationship between O3 levels and humidity. Concentrations of SO2 increase with the presence of PM10 and decrease with the wind speed and wind direction. They proved as well that there is a decreasing trend of pollutant concentrations over the last five years. Also, in rainy periods (March-June and September-December) some trends regarding precipitation were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data, and they also showed similar conditions and data distribution among the Carvajal, Tunal and Puente Aranda stations, and also between Parque Simon Bolivar and Las Ferias. It was verified that the aforementioned trends prevailed during the study period by applying the same technique per year. It was concluded that the PCA algorithm is useful to establish preliminary relationships among variables, and K-means clustering to find patterns in the data and understand its distribution. The discovery of patterns in the data allows using these clusters as an input to an Artificial Neural Network prediction model.
Keywords: air pollution, air quality modelling, data mining, particulate matter
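A minimal sketch of the PCA-then-K-means workflow the abstract describes, assuming hourly monitoring data in a pandas DataFrame; the file name and column names are illustrative assumptions, not the authors' dataset.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# hypothetical file and column names standing in for the monitoring-network data
df = pd.read_csv("bogota_air_quality_2010_2015.csv")
features = ["PM10", "PM2.5", "NO2", "O3", "SO2", "CO",
            "wind_speed", "wind_direction", "temperature", "humidity", "radiation"]
X = StandardScaler().fit_transform(df[features].dropna())

# principal components show which meteorological variables load together with each pollutant
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print(pd.DataFrame(pca.components_, columns=features, index=["PC1", "PC2", "PC3"]))

# K-means on the component scores groups hours/stations with similar conditions
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
```

The cluster labels obtained this way could then serve as the input to a downstream prediction model, as the abstract suggests.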
Procedia PDF Downloads 259
1518 Study on Network-Based Technology for Detecting Potentially Malicious Websites
Authors: Byung-Ik Kim, Hong-Koo Kang, Tae-Jin Lee, Hae-Ryong Park
Abstract:
Cyber terrors against specific enterprises or countries have been increasing recently. Such attacks against specific targets are called advanced persistent threat (APT), and they are giving rise to serious social problems. The malicious behaviors of APT attacks mostly affect websites and penetrate enterprise networks to perform malevolent acts. Although many enterprises invest heavily in security to defend against such APT threats, they recognize the APT attacks only after the latter are already in action. This paper discusses the characteristics of APT attacks at each step as well as the strengths and weaknesses of existing malicious code detection technologies to check their suitability for detecting APT attacks. It then proposes a network-based malicious behavior detection algorithm to protect the enterprise or national networks.
Keywords: Advanced Persistent Threat (APT), malware, network security, network packet, exploit kits
Procedia PDF Downloads 370
1517 Study on the Efficient Routing Algorithms in Delay-Tolerant Networks
Authors: Si-Gwan Kim
Abstract:
In Delay Tolerant Networks (DTN), an end-to-end path between source and destination may not exist at the time of message transmission. Employing a 'store, carry and forward' delivery mechanism for message transmission in such networks usually incurs long message delays. In this paper, we present a modified Binary Spray and Wait (BSW) routing protocol that enhances the performance of the original one. Our proposed algorithm adjusts the number of forwarded message copies depending on the number of neighbor nodes. By exchanging beacon messages periodically, the number of neighbor nodes can be tracked. Simulation results obtained with the ONE simulator show that our modified version gives a higher delivery ratio and lower latency compared to BSW.
Keywords: delay tolerant networks, store carry and forward, one simulator, binary spray and wait
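A hedged sketch of the kind of forwarding rule such a modified Binary Spray and Wait scheme could use; the neighbor-based scaling of the copy budget is an assumption, since the abstract does not give the exact formula.

```python
def forward_copies(own_copies: int, n_neighbors: int) -> int:
    """Return how many message copies to hand to an encountered node."""
    if own_copies <= 1:
        return 0                      # wait phase: only direct delivery to the destination
    handed = own_copies // 2          # classic binary spray: give away half of the copies
    # assumed modification: in denser neighborhoods, spray fewer copies per contact
    if n_neighbors > 0:
        handed = max(1, handed // n_neighbors)
    return handed

# example: a node holding 8 copies meets a peer while 2 neighbors are in range
print(forward_copies(8, 2))  # -> hands over 2 copies instead of the classic 4
```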
Procedia PDF Downloads 126
1516 Real Time Detection, Prediction and Reconstitution of Rain Drops
Authors: R. Burahee, B. Chassinat, T. de Laclos, A. Dépée, A. Sastim
Abstract:
The purpose of this paper is to propose a solution to detect, predict and reconstitute rain drops in real time, at night, using an embedded device with an infrared camera. To avoid demanding excessive hardware resources, simple models are used within an efficient image-processing algorithm, implemented with OpenCV, that considerably reduces computation time. Using a simple matching model, drops are associated across two consecutive frames to implement a tracking system; the computed trajectory of each drop then provides the information needed to predict its future location. Thanks to this technique, the processing load can be reduced. The hardware system, built around a Raspberry Pi, is optimized to host this code efficiently for real-time execution.
Keywords: reconstitution, prediction, detection, rain drop, real time, raspberry, infrared
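A rough OpenCV/Python sketch of the frame-to-frame matching and constant-velocity prediction idea; the brightness threshold, minimum blob area and matching distance are assumptions, and the input is assumed to be 8-bit grayscale IR frames.

```python
import cv2
import numpy as np

def detect_drops(frame):
    """Return centroids of bright blobs in an 8-bit grayscale IR frame."""
    _, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [np.mean(c.reshape(-1, 2), axis=0) for c in contours if cv2.contourArea(c) > 3]

def match_and_predict(prev_pts, curr_pts, max_dist=20.0):
    """Greedy nearest-neighbour matching between consecutive frames,
    then constant-velocity prediction of each drop's next position."""
    curr_pts = list(curr_pts)
    predictions = []
    for p in prev_pts:
        if not curr_pts:
            break
        d = [np.linalg.norm(p - c) for c in curr_pts]
        i = int(np.argmin(d))
        if d[i] < max_dist:
            c = curr_pts.pop(i)
            predictions.append(c + (c - p))   # next position = current + last displacement
    return predictions
```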
Procedia PDF Downloads 420
1515 The Influence of α-Defensin and Cytokine IL-1β, Molecular Factors of Innate Immune System, on Regulation of Inflammatory Periodontal Diseases in Orthodontic Patients
Authors: G. R. Khaliullina, S. L. Blashkova, I. G. Mustafin
Abstract:
The article presents the results of a study involving 97 patients with different types of orthodontic pathology. Immunological examination of the patients included determination of the levels of α-defensin and cytokine IL-1β in mixed saliva. The study showed that the level of α-defensin serves as a diagnostic marker for determining the therapeutic measures in the treatment of inflammatory processes in periodontal tissues. α-Defensins exhibit immunomodulating and antimicrobial activity during inflammatory processes and play an important role in the regulation of the pathology of periodontal disease. The obtained data allowed the development of an algorithm for diagnosis and the implementation of immunomodulating therapy in the treatment of periodontal diseases in orthodontic patients.
Keywords: α-defensin, cytokine, orthodontic treatment, periodontal disease, periodontal pathogens
Procedia PDF Downloads 181
1514 Improvement Perturb and Observe for a Fast Response MPPT Applied to Photovoltaic Panel
Authors: Labar Hocine, Kelaiaia Mounia Samira, Mesbah Tarek, Kelaiaia Samia
Abstract:
Maximum power point tracking (MPPT) techniques are used in photovoltaic (PV) systems to maximize the PV array output power by continuously tracking the maximum power point (MPP), which depends on panel temperature and irradiance conditions. The main drawback of P&O is that the operating point oscillates around the MPP, giving rise to the waste of some amount of available energy; moreover, it is well known that the P&O algorithm can be confused during time intervals characterized by rapidly changing atmospheric conditions. In this paper, it is shown that in order to limit the negative effects associated with the above drawbacks, the P&O MPPT parameters must be customized to the dynamic behavior of the specific converter adopted. A theoretical analysis allowing the optimal choice of such initial parameter settings is also carried out. The fast convergence of the proposed method is proven.
Keywords: P&O, Taylor’s series, MPPT, photovoltaic panel
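A minimal perturb-and-observe loop for reference; the step size, iteration count and panel interface are illustrative assumptions and do not reflect the parameter customization proposed in the paper.

```python
def perturb_and_observe(measure_pv, v_ref=30.0, step=0.5, iterations=200):
    """measure_pv(v_ref) -> (voltage, current) at the imposed operating point;
    on real hardware it would wrap the ADC / converter interface."""
    v_prev, i_prev = measure_pv(v_ref)
    p_prev = v_prev * i_prev
    direction = +1
    for _ in range(iterations):
        v_ref += direction * step        # perturb the reference voltage
        v, i = measure_pv(v_ref)
        p = v * i
        if p < p_prev:                   # power decreased: the last perturbation was wrong
            direction = -direction
        p_prev = p
    return v_ref                         # operating voltage near the maximum power point
```

The step size illustrates the trade-off the abstract discusses: a large step converges quickly but oscillates more around the MPP, a small step oscillates less but reacts slowly to changing irradiance.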
Procedia PDF Downloads 587
1513 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System
Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem
Abstract:
Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT's ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes of the kinematic state, extension, and observation distribution when an extended target maneuvers, a multiple-model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random matrix model (RMM) or sub-ellipses framework. Simulation results demonstrate the remarkable impact of the proposed approach.
Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter
Procedia PDF Downloads 78
1512 Towards a Resources Provisioning for Dynamic Workflows in the Cloud
Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem
Abstract:
Cloud computing offers a new model of service provisioning for workflow applications, thanks to its elasticity and its pricing model. However, it presents various challenges that need to be addressed in order to be utilized efficiently. The resources provisioning problem for workflow applications has been widely studied. Nevertheless, existing works did not consider changes to workflow instances while they are being executed. This functionality has become a major requirement to deal with unusual situations and evolution. This paper presents a first step towards resources provisioning for a dynamic workflow. In fact, we propose a provisioning algorithm which minimizes the overall workflow execution cost while meeting a deadline constraint. Then, we extend it to support the dynamic adding of tasks. Experimental results show that our proposed heuristic achieves a significant reduction in resource cost by using a consolidation process.
Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications
Procedia PDF Downloads 296
1511 Temperature Contour Detection of Salt Ice Using Color Thermal Image Segmentation Method
Authors: Azam Fazelpour, Saeed Reza Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses a novel image analysis based on thermal imaging to detect temperature contours created on a salt ice surface during transient phenomena. Thermal cameras detect objects by using their emissivities and IR radiance. The ice surface temperature is not uniform during transient processes; the temperature starts to increase from the boundary of the ice towards its center. Thermal cameras are able to report temperature changes on the ice surface at every individual moment. Various contours, which show different temperature areas, appear in the ice surface picture captured by a thermal camera. Identifying the exact boundary of these contours is valuable for facilitating ice surface temperature analysis. Image processing techniques are used to extract each contour area precisely. In this study, several pictures are recorded while the temperature is increasing throughout the ice surface, and some are selected to be processed at a specific time interval. An image segmentation method is applied to the images to determine the contour areas. Color thermal images are used to exploit the main information. The red, green and blue elements of the color images are investigated to find the best contour boundaries. Image enhancement and noise removal algorithms are applied to obtain clear, high-contrast images. A novel edge detection algorithm based on differences in the color of the pixels is established to determine contour boundaries. In this method, the edges of the contours are obtained according to properties of the red, blue and green image elements: each color element is assessed for its information content, useful elements are processed further, and useless elements are removed to reduce computation time. Neighboring pixels with close intensities are assigned to the same contour, and differences in intensities determine the boundaries. The results are then verified by conducting experimental tests. An experimental setup is built using ice samples and a thermal camera. To observe the ice contours with the thermal camera, the samples, which are initially at -20 °C, are brought into contact with a warmer surface. Pictures are captured for 20 seconds, and the method is applied to five images captured at 5-second intervals. The study shows that the green image element carries no useful information; therefore, the boundary detection method is applied to the red and blue image elements. In this case study, the results indicate that the proposed algorithm detects the boundaries more effectively than other edge detection methods such as Sobel and Canny. Comparison between the contours detected by this method and the temperature analysis, which gives the real boundaries, shows good agreement. This color image edge detection method is applicable to other similar cases with comparable image properties.
Keywords: color image processing, edge detection, ice contour boundary, salt ice, thermal image
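An illustrative sketch of edge detection from neighbouring-pixel differences in the red and blue channels, in the spirit of the method above; the threshold and the way the two channels are combined are assumptions, and the green channel is dropped as the abstract reports.

```python
import numpy as np
import cv2

def contour_edges(bgr_image, threshold=12):
    """Mark pixels whose red or blue values differ strongly from their right/down neighbours."""
    b = bgr_image[:, :, 0].astype(np.int16)
    r = bgr_image[:, :, 2].astype(np.int16)    # green channel ignored (carries little information)
    diff = np.zeros(b.shape, dtype=np.int16)
    for ch in (r, b):
        diff[:, :-1] = np.maximum(diff[:, :-1], np.abs(ch[:, 1:] - ch[:, :-1]))
        diff[:-1, :] = np.maximum(diff[:-1, :], np.abs(ch[1:, :] - ch[:-1, :]))
    return (diff > threshold).astype(np.uint8) * 255   # binary edge map

# edges = contour_edges(cv2.imread("thermal_frame.png"))   # hypothetical input file
```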
Procedia PDF Downloads 315
1510 A New Floating Point Implementation of Base 2 Logarithm
Authors: Ahmed M. Mansour, Ali M. El-Sawy, Ahmed T. Sayed
Abstract:
Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication and information theory. They are primarily used for hardware calculations, handling multiplications, divisions, powers, and roots effectively. There are three commonly used bases for logarithms: the logarithm with base 10 is called the common logarithm, the natural logarithm has base e, and the binary logarithm has base 2. This paper demonstrates different methods of calculating log2, showing the complexity of each, identifies the most accurate and efficient one, and gives insights into their hardware design. We present a new method called Floor Shift for fast calculation of log2, and then combine this algorithm with a Taylor series to improve the accuracy of the output, which we illustrate using two examples. We finally compare the algorithms and conclude with our remarks.
Keywords: logarithms, log2, floor, iterative, CORDIC, Taylor series
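A software sketch of the floor-shift idea combined with a short Taylor series for the fractional part; the normalization loop and the number of series terms are assumptions made for illustration, and a hardware version would extract the exponent by shifting rather than dividing.

```python
import math

def log2_floor_shift(x: float, terms: int = 6) -> float:
    if x <= 0:
        raise ValueError("log2 undefined for non-positive values")
    # integer part: scale the mantissa m into [1, 2); k counts the shifts
    k, m = 0, x
    while m >= 2.0:
        m /= 2.0
        k += 1
    while m < 1.0:
        m *= 2.0
        k -= 1
    # fractional part: log2(1+f) = ln(1+f)/ln(2) via a truncated Taylor series
    f, ln1pf = m - 1.0, 0.0
    for n in range(1, terms + 1):
        ln1pf += ((-1) ** (n + 1)) * f ** n / n
    return k + ln1pf / math.log(2.0)

print(log2_floor_shift(10.0), math.log2(10.0))   # close, within the series truncation error
```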
Procedia PDF Downloads 536
1509 Mobile Application Tool for Individual Maintenance Users on High-Rise Residential Buildings in South Korea
Authors: H. Cha, J. Kim, D. Kim, J. Shin, K. Lee
Abstract:
Since the 1980s, rapid economic growth has resulted in many aged apartment buildings in South Korea. Nevertheless, building maintenance practice is insufficient. In this study, to facilitate building maintenance, the authors classified building defects into three levels according to their level of performance and developed a mobile application tool based on the appropriate feedback for each level. The feedback structure consists of a 'maintenance manual phase', an 'online feedback phase', and a 'repair work phase of the specialty contractors'. In order to implement each phase, the authors devised the necessary database for each phase and created a prototype system that can evolve on its own. The authors expect that building users can easily maintain their buildings by using this application.
Keywords: building defect, maintenance practice, mobile application, system algorithm
Procedia PDF Downloads 189
1508 Method of Cluster Based Cross-Domain Knowledge Acquisition for Biologically Inspired Design
Authors: Shen Jian, Hu Jie, Ma Jin, Peng Ying Hong, Fang Yi, Liu Wen Hai
Abstract:
Biologically inspired design inspires inventions and new technologies in the field of engineering by mimicking functions, principles, and structures in the biological domain. To deal with the obstacles of cross-domain knowledge acquisition in the existing biologically inspired design process, functional semantic clustering based on functional feature semantic correlation and environmental constraint clustering composition based on environmental characteristic constraining adaptability are proposed. A knowledge cell clustering algorithm and the corresponding prototype system are developed. Finally, the effectiveness of the method is verified by the design of a visual prosthetic device.
Keywords: knowledge clustering, knowledge acquisition, knowledge based engineering, knowledge cell, biologically inspired design
Procedia PDF Downloads 428
1507 Solving Process Planning and Scheduling with Number of Operation Plus Processing Time Due-Date Assignment Concurrently Using a Genetic Search
Authors: Halil Ibrahim Demir, Alper Goksu, Onur Canpolat, Caner Erden, Melek Nur
Abstract:
Traditionally, process planning, scheduling and due date assignment are performed sequentially and separately. The high interrelation between these functions makes their integration very useful. Although there are numerous works on integrated process planning and scheduling and many works on scheduling with due date assignment, there are only a few works on the integration of these three functions. Here we tested different integration levels of these three functions and found the fully integrated version to be the best. We applied genetic search and random search, and genetic search was found to perform better than random search. We penalized all earliness, tardiness and due-date-related costs; since all three terms are undesired, it is better to penalize all of them.
Keywords: process planning, scheduling, due-date assignment, genetic algorithm, random search
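A hedged sketch of the kind of penalty-based cost a genetic search could minimize here; the weights, the job fields and the chromosome decoding are illustrative assumptions, not the paper's encoding.

```python
from dataclasses import dataclass

@dataclass
class Job:
    completion: float   # completion time obtained from the decoded schedule
    due_date: float     # assigned due date (e.g. based on operation count plus processing time)

def schedule_cost(jobs, w_early=1.0, w_tardy=4.0, w_due=0.5):
    """Total penalty combining earliness, tardiness and due-date length."""
    cost = 0.0
    for j in jobs:
        earliness = max(0.0, j.due_date - j.completion)
        tardiness = max(0.0, j.completion - j.due_date)
        cost += w_early * earliness + w_tardy * tardiness + w_due * j.due_date
    return cost

# a chromosome's fitness would typically be the negative (or reciprocal) of this cost
print(schedule_cost([Job(completion=12.0, due_date=10.0), Job(completion=8.0, due_date=10.0)]))
```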
Procedia PDF Downloads 375
1506 High Capacity Reversible Watermarking through Interpolated Error Shifting
Authors: Hae-Yeoun Lee
Abstract:
Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation
Procedia PDF Downloads 325
1505 Deployment of Matrix Transpose in Digital Image Encryption
Authors: Okike Benjamin, Garba E J. D.
Abstract:
Encryption is used to conceal information from prying eyes. Presently, information and data encryption are common due to the volume of data and information in transit across the globe on a daily basis. Image encryption is yet to receive the attention of researchers that it deserves; in other words, video and multimedia documents are exposed to unauthorized access. The authors propose image encryption using matrix transpose. An algorithm that allows image encryption is developed. In this proposed image encryption technique, the image to be encrypted is split into parts based on the image size. Each part is encrypted separately using matrix transpose. The actual encryption is performed on the picture elements (pixels) that make up the image. After encrypting each part of the image, the positions of the encrypted parts are swapped before transmission of the image takes place. Swapping the positions of the parts makes the encrypted image harder for any cryptanalyst to decrypt.
Keywords: image encryption, matrices, pixel, matrix transpose
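A toy sketch of block-wise transposition with block-position swapping, as the abstract describes; the block size and the seeded swap permutation are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def encrypt(image: np.ndarray, block: int = 64, seed: int = 42):
    """Transpose each square block of a grayscale image, then shuffle block positions."""
    h, w = (image.shape[0] // block) * block, (image.shape[1] // block) * block
    img = image[:h, :w]
    blocks = [img[r:r + block, c:c + block].T            # transpose each block
              for r in range(0, h, block) for c in range(0, w, block)]
    perm = np.random.default_rng(seed).permutation(len(blocks))   # swap block positions
    out = np.zeros_like(img)
    for k, p in enumerate(perm):
        r, c = divmod(k, w // block)
        out[r * block:(r + 1) * block, c * block:(c + 1) * block] = blocks[p]
    return out, perm   # decryption reverses the permutation and transposes each block again
```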
Procedia PDF Downloads 423
1504 Optimal Maintenance and Improvement Policies in Water Distribution System: Markov Decision Process Approach
Authors: Jong Woo Kim, Go Bong Choi, Sang Hwan Son, Dae Shik Kim, Jung Chul Suh, Jong Min Lee
Abstract:
A Markov Decision Process (MDP) based methodology is implemented in order to establish the optimal schedule which minimizes the cost. The formulation of the MDP problem is presented using information about the current state of the pipe, improvement cost, failure cost and a pipe deterioration model. The objective function and the detailed algorithm of dynamic programming (DP) are modified due to the difficulty of implementing the conventional DP approaches. The optimal schedule derived from the suggested model is compared to several policies via Monte Carlo simulation. The validity of the solution and the improvement in computational time are demonstrated.
Keywords: Markov decision processes, dynamic programming, Monte Carlo simulation, periodic replacement, Weibull distribution
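A generic value-iteration sketch for a small pipe-condition MDP; the states, actions, transition probabilities and costs below are invented for illustration, and the paper's modified DP algorithm is not reproduced here.

```python
import numpy as np

states = ["good", "worn", "failed"]
actions = ["do_nothing", "repair", "replace"]
# P[a][s, s'] : deterioration / restoration probabilities (assumed values)
P = {
    "do_nothing": np.array([[0.8, 0.2, 0.0], [0.0, 0.7, 0.3], [0.0, 0.0, 1.0]]),
    "repair":     np.array([[0.9, 0.1, 0.0], [0.7, 0.3, 0.0], [0.2, 0.6, 0.2]]),
    "replace":    np.array([[1.0, 0.0, 0.0]] * 3),
}
cost = {"do_nothing": [0, 5, 100], "repair": [10, 20, 60], "replace": [50, 50, 50]}

def value_iteration(gamma=0.95, tol=1e-6):
    V = np.zeros(len(states))
    while True:
        Q = {a: np.array(cost[a]) + gamma * P[a] @ V for a in actions}   # expected cost per action
        V_new = np.min(np.column_stack(list(Q.values())), axis=1)        # best action per state
        if np.max(np.abs(V_new - V)) < tol:
            break
        V = V_new
    policy = [actions[int(np.argmin([Q[a][s] for a in actions]))] for s in range(len(states))]
    return V, policy

print(value_iteration())
```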
Procedia PDF Downloads 425
1503 Aerobic Bioprocess Control Using Artificial Intelligence Techniques
Authors: M. Caramihai, Irina Severin
Abstract:
This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation. The objective of the process control is to produce biomass in a desired physiological state. The work demonstrates that the designed Hybrid Control Techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm as well as its tuning through realistic simulations is presented. Taking into consideration the synergism of different paradigms like fuzzy logic, neural networks, and symbolic artificial intelligence (AI), this paper presents a complete intelligent control architecture with application in bioprocess control.
Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques
Procedia PDF Downloads 425
1502 Channel Estimation for LTE Downlink
Authors: Rashi Jain
Abstract:
LTE systems employ Orthogonal Frequency Division Multiplexing (OFDM) as the multiple access technology for the downlink channels. For enhanced performance, accurate channel estimation is required. Various algorithms, such as Least Squares (LS), Minimum Mean Square Error (MMSE) and Recursive Least Squares (RLS), can be employed for this purpose. This paper proposes a channel estimation algorithm based on the Kalman filter for the LTE downlink system. Using the frequency-domain pilots, the initial channel response is obtained using the LS criterion; the Kalman filter is then employed to track the channel variations in the time domain. To suppress the noise within a symbol, threshold processing is employed. The paper draws a comparison between LS, MMSE, RLS and Kalman filter channel estimation. The parameters for evaluation are Bit Error Rate (BER), Mean Square Error (MSE) and run-time.
Keywords: LTE, channel estimation, OFDM, RLS, Kalman filter, threshold
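A simplified per-subcarrier sketch of the approach: an LS estimate at the pilot positions followed by a scalar Kalman filter tracking the channel tap across OFDM symbols. The first-order Gauss-Markov model and the noise variances are assumptions, not the paper's settings.

```python
import numpy as np

def ls_estimate(received_pilots, known_pilots):
    return received_pilots / known_pilots            # H_LS = Y / X at the pilot subcarriers

def kalman_track(h_ls_sequence, a=0.98, q=1e-3, r=1e-2):
    """Track one channel tap over successive symbols; h_ls_sequence holds the LS measurements."""
    h_est, p = h_ls_sequence[0], 1.0
    out = [h_est]
    for z in h_ls_sequence[1:]:
        h_pred, p_pred = a * h_est, a * a * p + q    # predict with the Gauss-Markov model
        k = p_pred / (p_pred + r)                    # Kalman gain
        h_est = h_pred + k * (z - h_pred)            # update with the new LS measurement
        p = (1 - k) * p_pred
        out.append(h_est)
    return np.array(out)
```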
Procedia PDF Downloads 359
1501 Black-Box-Base Generic Perturbation Generation Method under Salient Graphs
Authors: Dingyang Hu, Dan Liu
Abstract:
DNN (Deep Neural Network) deep learning models are widely used in classification, prediction, and other task scenarios. To address the difficulties of generic adversarial perturbation generation for deep learning models under black-box conditions, a generic adversarial perturbation generation method based on a saliency map (CJsp) is proposed, which obtains salient image regions by measuring how the input features of an image influence the output results. This method can be understood as a saliency map attack algorithm that obtains false classification results by reducing the weights of salient feature points. Experiments also demonstrate that this method achieves a high success rate for transfer (migration) attacks and is a batch adversarial sample generation method.
Keywords: adversarial sample, gradient, probability, black box
Procedia PDF Downloads 107
1500 Solving Optimal Control of Semilinear Elliptic Variational Inequalities Obstacle Problems using Smoothing Functions
Authors: El Hassene Osmani, Mounir Haddou, Naceurdine Bensalem
Abstract:
In this paper, we investigate optimal control problems governed by semilinear elliptic variational inequalities involving constraints on the state and, more precisely, the obstacle problem. We present a relaxed formulation for the problem using smoothing functions. Since we adopt a numerical point of view, we first relax the feasible domain of the problem; then, using both mathematical programming methods and penalization methods, we obtain optimality conditions with smooth Lagrange multipliers. Some numerical experiments using the IPOPT (Interior Point Optimizer) algorithm are presented to verify the efficiency of our approach.
Keywords: complementarity problem, IPOPT, Lagrange multipliers, mathematical programming, optimal control, smoothing methods, variational inequalities
Procedia PDF Downloads 175
1499 Digital Cinema Watermarking State of Art and Comparison
Authors: H. Kelkoul, Y. Zaz
Abstract:
Nowadays, the vigorous popularity of video processing techniques has resulted in explosive growth in the illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore some watermarking techniques in order to observe their specificities and select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that includes the date, time and place where the hacking was done. We have studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain and, finally, the digital cinema watermarking transform domain. In this paper, a detailed technique is presented where embedding is performed using a direct-sequence spread spectrum technique in the DWT transform domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
Keywords: digital cinema, watermarking, wavelet DWT, spread spectrum, JPEG2000 MPEG4
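A hedged sketch of direct-sequence spread-spectrum embedding of a single watermark bit in the DWT domain, using PyWavelets; the wavelet, embedding strength and the choice of the horizontal detail sub-band are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import pywt

def embed_bit(image, bit, key=1234, alpha=2.0):
    """Spread one watermark bit over the horizontal-detail DWT coefficients."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    pn = np.random.default_rng(key).choice([-1.0, 1.0], size=cH.shape)   # keyed PN sequence
    cH_w = cH + alpha * (1 if bit else -1) * pn
    return pywt.idwt2((cA, (cH_w, cV, cD)), "haar")

def detect_bit(watermarked, key=1234):
    """Correlation detector: the sign of the correlation with the PN sequence recovers the bit."""
    _, (cH, _, _) = pywt.dwt2(watermarked.astype(float), "haar")
    pn = np.random.default_rng(key).choice([-1.0, 1.0], size=cH.shape)
    return float(np.sum(cH * pn)) > 0
```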
Procedia PDF Downloads 251
1498 Bayesian Structural Identification with Systematic Uncertainty Using Multiple Responses
Authors: André Jesus, Yanjie Zhu, Irwanda Laory
Abstract:
Structural health monitoring is one of the most promising technologies for averting structural risk and achieving economic savings. Analysts often have to deal with a considerable variety of uncertainties that arise during a monitoring process. Namely, the widespread application of numerical models (model-based approaches) is accompanied by a widespread concern about quantifying the uncertainties prevailing in their use. Some of these uncertainties are related to the deterministic nature of the model (code uncertainty), others to the variability of its inputs (parameter uncertainty) and to the discrepancy between model and experiment (systematic uncertainty). The actual process always exhibits random behaviour (observation error), even when conditions are set identically (residual variation). Bayesian inference assumes that the parameters of a model are random variables with an associated PDF, which can be inferred from experimental data. However, in many Bayesian methods the determination of systematic uncertainty can be problematic. In this work, systematic uncertainty is associated with a discrepancy function. The numerical model and discrepancy function are approximated by Gaussian processes (surrogate model). Finally, to avoid the computational burden of a fully Bayesian approach, the parameters that characterise the Gaussian processes were estimated in a four-stage process (modular Bayesian approach). The proposed methodology has been successfully applied in fields such as geoscience, biomedicine and particle physics, but never in the SHM context. This approach considerably reduces the computational burden, although the extent of the considered uncertainties is lower (second-order effects are neglected). To successfully identify the considered uncertainties, this formulation was extended to consider multiple responses. The efficiency of the algorithm has been tested on a small-scale aluminium bridge structure subjected to thermal expansion due to infrared heaters. A comparison of its performance with responses measured at different points of the structure and the associated degrees of identifiability is also carried out. A numerical FEM model of the structure was developed, and the stiffness of its supports is considered as a parameter to calibrate. Results show that the modular Bayesian approach performed best when responses of the same type had the lowest spatial correlation. Based on previous literature, using different types of responses (strain, acceleration, and displacement) should also improve the identifiability problem. Uncertainties due to parametric variability, observation error, residual variability, code variability and systematic uncertainty were all recovered. For this example, the algorithm performance was stable and considerably quicker than Bayesian methods that account for the full extent of uncertainties. Future research with real-life examples is required to fully assess the advantages and limitations of the proposed methodology.
Keywords: bayesian, calibration, numerical model, system identification, systematic uncertainty, Gaussian process
Procedia PDF Downloads 329
1497 Road Vehicle Recognition Using Magnetic Sensing Feature Extraction and Classification
Authors: Xiao Chen, Xiaoying Kong, Min Xu
Abstract:
This paper presents a road vehicle detection approach for intelligent transportation systems. This approach mainly uses a low-cost magnetic sensor and an associated data collection system to collect magnetic signals. This system can measure changes in the magnetic field, and it can also detect and count vehicles. We extend Mel Frequency Cepstral Coefficients to analyze vehicle magnetic signals. Vehicle type features are extracted using representations of the cepstrum, frame energy, and gap cepstrum of the magnetic signals. We design a 2-dimensional map algorithm using Vector Quantization to classify vehicle magnetic features into four typical types of vehicles in Australian suburbs: sedan, van, truck, and bus. Experimental results show that our approach achieves a high level of accuracy for vehicle detection and classification.
Keywords: vehicle classification, signal processing, road traffic model, magnetic sensing
Procedia PDF Downloads 322
1496 Research on ARQ Transmission Technique in Mars Detection Telecommunications System
Authors: Zhongfei Cai, Hui He, Changsheng Li
Abstract:
This paper studies the automatic repeat request (ARQ) transmission technique in a Mars detection telecommunications system and proposes an ARQ method applied to the Proximity-1 space link protocol. In order to ensure efficient and reliable data transmission, this ARQ method combines the characteristics of different ARQ schemes. Considering the Mars detection communication environment, this paper analyzes the saturation throughput rate, packet dropping probability, average delay and energy efficiency of different ARQ algorithms. Combining these results with the theory of ARQ transmission, an ARQ transmission scheme for the Mars detection telecommunications system was established. The simulation results show that this algorithm achieves an excellent saturation throughput rate and energy efficiency with low complexity.
Keywords: ARQ, Mars, CCSDS, Proximity-1, deep space
Procedia PDF Downloads 342
1495 Density-based Denoising of Point Cloud
Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng
Abstract:
Point cloud source data for surface reconstruction are usually contaminated with noise and outliers. To overcome this, we present a novel approach using a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of the multivariate KDE using a particle swarm optimization technique, which ensures robust density estimation. Then we use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters. We then compute the distance of each point from its centroid; points identified as outliers are removed by an automatic thresholding scheme, which yields an accurate and economical point surface. The experimental results show that our approach is comparably robust and efficient.
Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation
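A simplified sketch of density-based outlier removal: score each point with a Gaussian KDE and drop low-density points. The rule-of-thumb bandwidth and the quantile threshold are assumptions; the paper's PSO bandwidth search, mean-shift step and bilateral filtering are omitted here.

```python
import numpy as np
from scipy.stats import gaussian_kde

def remove_outliers(points: np.ndarray, keep_fraction: float = 0.95) -> np.ndarray:
    """points: (N, 3) array; returns the densest keep_fraction of the points."""
    kde = gaussian_kde(points.T)               # multivariate KDE, rule-of-thumb bandwidth
    density = kde(points.T)                    # density estimate at each point
    threshold = np.quantile(density, 1.0 - keep_fraction)
    return points[density >= threshold]

# example with synthetic data: a dense cluster plus a few scattered outliers
cloud = np.vstack([np.random.normal(0, 0.05, (500, 3)),
                   np.random.uniform(-2, 2, (20, 3))])
print(remove_outliers(cloud).shape)
```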
Procedia PDF Downloads 349
1494 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. To evaluate the effectiveness of the proposed method, it is tested on the MNIST, CENPARMI and CEDAR databases, as well as newly collected data. The proposed method is applied to more than one lakh (100,000) digit images and obtains good comparative recognition results, achieving a recognition rate of about 97.32%.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
Procedia PDF Downloads 464
1493 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the rapid growth in the development of and demand for multimedia products has been straining device bandwidth and network storage memory. Consequently, the theory of data compression becomes more significant for reducing data redundancy in order to save on data transfer and storage. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance than existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, which gives, in a short execution time, better image compression.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
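An illustrative sketch of a block (subspace) power iteration for the leading singular triplets, the building block of SVD-based low-rank image compression; the block size, iteration count and reconstruction step are assumptions, and this is not the authors' exact Block SVD Power Method.

```python
import numpy as np

def block_power_svd(A, k=20, iterations=100, seed=0):
    """Approximate the top-k singular triplets of A via orthogonalized block power iteration."""
    rng = np.random.default_rng(seed)
    V = np.linalg.qr(rng.standard_normal((A.shape[1], k)))[0]   # random orthonormal start
    for _ in range(iterations):
        V, _ = np.linalg.qr(A.T @ (A @ V))     # power step on A^T A, re-orthogonalized
    AV = A @ V
    s = np.linalg.norm(AV, axis=0)             # approximate singular values
    U = AV / s                                 # corresponding left singular vectors
    return U, s, V

def compress(image, k=20):
    U, s, V = block_power_svd(image.astype(float), k)
    return U @ np.diag(s) @ V.T                # rank-k approximation of the image
```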
Procedia PDF Downloads 388
1492 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand
Authors: Leila Jafari, Viliam Makis
Abstract:
In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and it is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with the policy considering zero safety stock.
Keywords: condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand
Procedia PDF Downloads 467
1491 Developing a Framework for Sustainable Social Housing Delivery in Greater Port Harcourt City Rivers State, Nigeria
Authors: Enwin Anthony Dornubari, Visigah Kpobari Peter
Abstract:
This research has developed a framework for the provision of sustainable and affordable housing to accommodate the low-income population of Greater Port Harcourt City. The objectives of this study, among others, were to: examine UN-Habitat guidelines for acceptable and sustainable social housing provision; describe past efforts of the Rivers State Government and the Federal Government of Nigeria to provide housing for the poor in the Greater Port Harcourt City area; obtain a profile of prospective beneficiaries of the social housing proposed by this research, as well as perceptions of their present living conditions and of living in the proposed self-sustaining social housing development, based on an initial simulation of the proposal; describe the nature of the framework, guidelines and management of the proposed social housing development; and explain the modalities for its implementation. The study utilized the mixed-methods research approach, aimed at triangulating findings from the quantitative and qualitative paradigms. Opinions were sought and analyzed from professionals of the built environment; the Director of Development Control, Greater Port Harcourt City Development Authority; Directors of the Ministry of Urban Development and Physical Planning and of the Housing and Property Development Authority; and managers of selected primary mortgage institutions. There were four target populations for the study, namely: members of occupational sub-groups for Focus Group Discussions (FGDs); development professionals for Key Informant Interviews (KIIs); household heads in selected communities of GPHC; and relevant public officials for Individual Depth Interviews (IDIs). Focus Group Discussions were held with members of occupational sub-groups (fisherfolk) in each of the eight selected communities. There were forty (40) members across all occupational sub-groups in each selected community, yielding a total of 320 in the eight (8) communities of Mgbundukwu (Mile 2 Diobu), Rumuodomaya, Abara (Etche), Igwuruta-Ali (Ikwerre), Wakama (Ogu-Bolo), Okujagu (Okrika), Akpajo (Eleme), and Okoloma (Oyigbo). For the key informant interviews, two (2) members were judgmentally selected from each of the following development professions: urban and regional planners; architects; estate surveyors; land surveyors; quantity surveyors; and engineers. Concerning Population 3, household heads in selected communities of GPHC, a stratified multi-stage sampling procedure was adopted. Stage 1: obtaining a 10% (a priori decision) sample of the component communities of GPHC in each stratum, with the number in each stratum rounded to a whole number to ensure representation of each stratum. Stage 2: obtaining the number of households to be studied by applying the Taro Yamane formula, which aided in determining the appropriate number of cases to be studied at a precision level of 5%. Findings revealed, amongst others, that poor implementation of the UN-Habitat global shelter strategy, lack of stakeholder engagement, inappropriate locations, undue bureaucracy, lack of housing fairness and equity, and the high cost of land and building materials were the reasons for the failure of past efforts towards social housing provision in the Greater Port Harcourt City area. The study recommended a public-private partnership approach for the implementation and management of the framework.
It also recommended a robust and sustained relationship between the management of the framework, the UN-Habitat office, other relevant government agencies responsible for housing development, and all investment partners, to create trust and efficiency.
Keywords: development, framework, low-income, sustainable, social housing
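The Taro Yamane step mentioned above follows the standard formula n = N / (1 + N·e²); a small worked sketch at the 5% precision level, with an assumed population figure rather than the study's actual household count:

```python
def yamane_sample_size(population: int, error: float = 0.05) -> int:
    """Taro Yamane sample size: n = N / (1 + N * e^2)."""
    return round(population / (1 + population * error ** 2))

print(yamane_sample_size(2500))   # about 345 households for an assumed stratum of 2,500
```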
Procedia PDF Downloads 255
1490 Numerical Simulation of Rayleigh Benard Convection and Radiation Heat Transfer in Two-Dimensional Enclosure
Authors: Raoudha Chaabane, Faouzi Askri, Sassi Ben Nasrallah
Abstract:
A new numerical algorithm is developed to solve coupled convection-radiation heat transfer in a two-dimensional enclosure. Radiative heat transfer in the participating medium is computed using the control volume finite element method (CVFEM). The radiative transfer equations (RTE) are formulated for an absorbing, emitting and scattering medium. The density, velocity and temperature fields are calculated using the double-population lattice Boltzmann equation (LBE). In order to test the efficiency of the developed method, Rayleigh-Bénard convection with and without radiative heat transfer is analyzed. The obtained results are validated against available works in the literature, and the proposed method is found to be efficient, accurate and numerically stable.
Keywords: participating media, LBM, CVFEM, radiation coupled with convection
Procedia PDF Downloads 409