Search results for: evolved bat algorithm
1609 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques
Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo
Abstract:
Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to fast economic growth over the last ten years. Bogotá has been affected by high-pollution events in which concentrations of PM10 and NO2 exceeded the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their concentrations in the atmosphere depend on local meteorological factors. Therefore, it is necessary to establish a relationship between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2 and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network for the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain primary relations between all the parameters, and afterwards the K-means clustering technique was implemented to corroborate those relations and to find patterns in the data. PCA was also applied on a per-shift basis (morning, afternoon, night and early morning) to check for variation of the previous trends, and on a per-year basis to verify that the identified trends held throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors with the greatest influence on PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. It was also found that O3 levels are directly proportional to wind speed and radiation, while there is an inverse relationship between O3 levels and humidity. Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The results also showed a decreasing trend of pollutant concentrations over the last five years. In addition, in rainy periods (March-June and September-December) some trends related to precipitation were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data, and they also showed similar conditions and data distributions among the Carvajal, Tunal and Puente Aranda stations, and between Parque Simón Bolívar and Las Ferias. By applying the same technique per year, it was verified that the aforementioned trends prevailed during the study period. It was concluded that the PCA algorithm is useful to establish preliminary relationships among variables, and K-means clustering to find patterns in the data and understand their distribution. The discovery of patterns in the data allows using these clusters as input to an Artificial Neural Network prediction model.
Keywords: air pollution, air quality modelling, data mining, particulate matter
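The pipeline described above (standardise the station records, extract principal components, then cluster) can be sketched in a few lines. The following is an illustrative example only; the synthetic data frame and column names are assumptions, not the study's dataset:

```python
# Illustrative sketch of the PCA + K-means workflow; data are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

cols = ["PM10", "PM2.5", "NO2", "O3", "SO2", "CO",
        "wind_speed", "wind_dir", "temperature", "humidity", "radiation"]
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, len(cols))), columns=cols)  # stand-in for station records

X = StandardScaler().fit_transform(df)          # variables are on different scales

pca = PCA(n_components=3).fit(X)                # primary relations between parameters
loadings = pd.DataFrame(pca.components_.T, index=cols,
                        columns=["PC1", "PC2", "PC3"])
print(loadings.round(2))                        # which variables load together

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
df["cluster"] = kmeans.labels_                  # groups with similar conditions
print(df.groupby("cluster")[cols].mean().round(2))
```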
Procedia PDF Downloads 258
1608 Study on Network-Based Technology for Detecting Potentially Malicious Websites
Authors: Byung-Ik Kim, Hong-Koo Kang, Tae-Jin Lee, Hae-Ryong Park
Abstract:
Cyber terror attacks against specific enterprises or countries have been increasing recently. Such attacks against specific targets are called advanced persistent threats (APT), and they are giving rise to serious social problems. The malicious behaviors of APT attacks mostly involve websites and penetrate enterprise networks to perform malevolent acts. Although many enterprises invest heavily in security to defend against such APT threats, they recognize the attacks only after they are already in action. This paper discusses the characteristics of APT attacks at each step as well as the strengths and weaknesses of existing malicious code detection technologies to check their suitability for detecting APT attacks. It then proposes a network-based malicious behavior detection algorithm to protect enterprise or national networks.
Keywords: Advanced Persistent Threat (APT), malware, network security, network packet, exploit kits
Procedia PDF Downloads 365
1607 Study on the Efficient Routing Algorithms in Delay-Tolerant Networks
Authors: Si-Gwan Kim
Abstract:
In Delay Tolerant Networks (DTN), an end-to-end path between source and destination may not exist at the time of message transmission. Employing the ‘Store Carry and Forward’ delivery mechanism for message transmission in such networks usually incurs long message delays. In this paper, we present a modified Binary Spray and Wait (BSW) routing protocol that enhances the performance of the original one. Our proposed algorithm adjusts the number of forwarded message copies depending on the number of neighbor nodes. By using beacon messages periodically, the number of neighbor nodes can be tracked. Simulation results using the ONE simulator show that our modified version gives a higher delivery ratio and lower latency compared to BSW.
Keywords: delay tolerant networks, store carry and forward, one simulator, binary spray and wait
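As a rough illustration of the spray phase described above, the sketch below halves the copy budget at each forwarding contact and scales the initial budget with the current neighbour count. The exact adjustment rule used by the authors is not given in the abstract, so the scaling here is only an assumption:

```python
# Minimal sketch of Binary Spray and Wait with a hypothetical neighbour-based
# copy-budget adjustment (the scaling rule is an assumption, not the paper's).
def initial_copies(base_copies: int, n_neighbors: int, max_copies: int = 32) -> int:
    """Scale the number of message copies with the current neighbour count."""
    return max(1, min(max_copies, base_copies * max(1, n_neighbors) // 2))

def binary_spray(copies: int) -> tuple[int, int]:
    """On contact, keep half the copies and hand the other half to the peer."""
    if copies <= 1:
        return 1, 0          # wait phase: only direct delivery to the destination
    give = copies // 2
    return copies - give, give

copies = initial_copies(base_copies=8, n_neighbors=5)   # 20 copies in this example
keep, give = binary_spray(copies)
print(copies, keep, give)
```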
Procedia PDF Downloads 123
1606 Real Time Detection, Prediction and Reconstitution of Rain Drops
Authors: R. Burahee, B. Chassinat, T. de Laclos, A. Dépée, A. Sastim
Abstract:
The purpose of this paper is to propose a solution to detect, predict and reconstitute rain drops in real time – during the night – using an embedded system with an infrared camera. To keep the hardware requirements low, simple models are used in an efficient image-processing algorithm that considerably reduces computation time in OpenCV. Drops are matched across two consecutive pictures to implement a tracking system, and the computed trajectory of each drop provides the information needed to predict its future location. Thanks to this technique, the processing load can be reduced. The hardware system, built around a Raspberry Pi, is optimized to host this code efficiently for real-time execution.
Keywords: reconstitution, prediction, detection, rain drop, real time, raspberry, infrared
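A minimal sketch of such a detect-match-predict loop is shown below. It uses simple brightness thresholding, centroid extraction and constant-velocity extrapolation, which are assumptions for illustration rather than the authors' actual models; the two frames are synthetic stand-ins for camera images:

```python
# Illustrative detect/match/predict sketch (OpenCV 4 API assumed).
import cv2
import numpy as np

def detect_drops(gray):
    """Return centroids of bright blobs in a grayscale infrared frame."""
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    pts = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            pts.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return np.array(pts)

# two synthetic consecutive frames containing one falling drop
f1 = np.zeros((120, 160), dtype=np.uint8); cv2.circle(f1, (80, 30), 3, 255, -1)
f2 = np.zeros((120, 160), dtype=np.uint8); cv2.circle(f2, (82, 45), 3, 255, -1)

p1, p2 = detect_drops(f1), detect_drops(f2)
for a in p1:                                    # nearest-neighbour matching
    b = p2[np.argmin(np.linalg.norm(p2 - a, axis=1))]
    predicted = b + (b - a)                     # constant-velocity prediction
    print(f"drop {a} -> {b}, predicted next position {predicted}")
```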
Procedia PDF Downloads 417
1605 The Influence of α-Defensin and Cytokine IL-1β, Molecular Factors of Innate Immune System, on Regulation of Inflammatory Periodontal Diseases in Orthodontic Patients
Authors: G. R. Khaliullina, S. L. Blashkova, I. G. Mustafin
Abstract:
The article presents the results of a study involving 97 patients with different types of orthodontic pathology. Immunological examination of the patients included determination of the levels of α-defensin and the cytokine IL-1β in mixed saliva. The study showed that the level of α-defensin serves as a diagnostic marker for determining the therapeutic measures in the treatment of inflammatory processes in periodontal tissues. α-Defensins exhibit immunomodulating and antimicrobial activity during inflammatory processes and play an important role in the regulation of periodontal disease. The obtained data allowed the development of an algorithm for diagnosis and the implementation of immunomodulating therapy in the treatment of periodontal diseases in orthodontic patients.
Keywords: α-defensin, cytokine, orthodontic treatment, periodontal disease, periodontal pathogens
Procedia PDF Downloads 177
1604 Improvement Perturb and Observe for a Fast Response MPPT Applied to Photovoltaic Panel
Authors: Labar Hocine, Kelaiaia Mounia Samira, Mesbah Tarek, Kelaiaia Samia
Abstract:
Maximum power point tracking (MPPT) techniques are used in photovoltaic (PV) systems to maximize the PV array output power by continuously tracking the maximum power point (MPP), which depends on panel temperature and irradiance conditions. The main drawback of P&O is that the operating point oscillates around the MPP, giving rise to the waste of some amount of the available energy; moreover, it is well known that the P&O algorithm can be confused during time intervals characterized by rapidly changing atmospheric conditions. In this paper, it is shown that in order to limit the negative effects associated with the above drawbacks, the P&O MPPT parameters must be customized to the dynamic behavior of the specific converter adopted. A theoretical analysis allowing the optimal choice of such initial parameters is also carried out. The fast convergence of the proposed approach is demonstrated.
Keywords: P&O, Taylor’s series, MPPT, photovoltaic panel
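For reference, the classic P&O update rule that the paper refines can be sketched as follows; the step size and measurements are illustrative, and the converter-specific tuning proposed by the authors is not reproduced here:

```python
# Minimal sketch of the classic perturb-and-observe (P&O) update step.
def perturb_and_observe(v, i, v_prev, p_prev, v_ref, step=0.01):
    """Return the updated voltage reference and the new (v, p) memory."""
    p = v * i
    if p > p_prev:                     # last perturbation increased power:
        v_ref += step if v > v_prev else -step   # keep moving in the same direction
    else:                              # power dropped: reverse direction
        v_ref += -step if v > v_prev else step
    return v_ref, v, p

# one iteration with hypothetical measurements
v_ref, v_prev, p_prev = 17.0, 16.9, 55.0
v_ref, v_prev, p_prev = perturb_and_observe(v=17.1, i=3.3, v_prev=v_prev,
                                            p_prev=p_prev, v_ref=v_ref)
print(v_ref, v_prev, p_prev)
```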
Procedia PDF Downloads 585
1603 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System
Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem
Abstract:
Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT’s ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes of kinematic state, extension, and observation distribution over an extended object when a target maneuvers, a multiple-model technique is introduced based on an MB track-before-detect (TBD) filter supported by an SK sub-random-matrix model (RMM) or sub-ellipses framework. Simulation results demonstrate the effectiveness of the proposed filter.
Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter
Procedia PDF Downloads 74
1602 Towards a Resources Provisioning for Dynamic Workflows in the Cloud
Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem
Abstract:
Cloud computing offers a new model of service provisioning for workflow applications, thanks to its elasticity and its pricing model. However, it presents various challenges that need to be addressed in order to be efficiently utilized. The resource provisioning problem for workflow applications has been widely studied. Nevertheless, existing works did not consider changes to workflow instances while they are being executed. This functionality has become a major requirement to deal with unusual situations and evolution. This paper presents a first step towards resource provisioning for dynamic workflows. In fact, we propose a provisioning algorithm which minimizes the overall workflow execution cost while meeting a deadline constraint. Then, we extend it to support the dynamic addition of tasks. Experimental results show that our proposed heuristic achieves a significant reduction in resource cost by using a consolidation process.
Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications
Procedia PDF Downloads 293
1601 Temperature Contour Detection of Salt Ice Using Color Thermal Image Segmentation Method
Authors: Azam Fazelpour, Saeed Reza Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses a novel image analysis based on thermal imaging to detect temperature contours created on a salt ice surface during transient phenomena. Thermal cameras detect objects by using their emissivities and IR radiance. The ice surface temperature is not uniform during transient processes. The temperature starts to increase from the boundary of the ice towards its center. Thermal cameras are able to report temperature changes on the ice surface at every individual moment. Various contours, which show different temperature areas, appear in the ice surface picture captured by a thermal camera. Identifying the exact boundary of these contours is valuable to facilitate ice surface temperature analysis. Image processing techniques are used to extract each contour area precisely. In this study, several pictures are recorded while the temperature is increasing throughout the ice surface. Some pictures are selected to be processed at specific time intervals. An image segmentation method is applied to the images to determine the contour areas. Color thermal images are used to exploit the main information. Red, green and blue elements of the color images are investigated to find the best contour boundaries. Image enhancement and noise removal algorithms are applied to the images to obtain high-contrast, clear images. A novel edge detection algorithm based on differences in the color of the pixels is established to determine contour boundaries. In this method, the edges of the contours are obtained according to properties of the red, blue and green image elements. The color image elements are assessed considering the information they carry. Useful elements proceed to processing and useless elements are removed from the process to reduce the computing time. Neighboring pixels with close intensities are assigned to one contour, and differences in intensities determine boundaries. The results are then verified by conducting experimental tests. An experimental setup is built using ice samples and a thermal camera. To observe the created ice contours with the thermal camera, the samples, which are initially at -20 °C, are brought into contact with a warmer surface. Pictures are captured for 20 seconds. The method is applied to five images, which are captured at 5-second intervals. The study shows that the green image element carries no useful information; therefore, the boundary detection method is applied to the red and blue image elements. In this case study, the results indicate that the proposed algorithm shows the boundaries more effectively than other edge detection methods such as Sobel and Canny. Comparison between the contour detection of this method and the temperature analysis, which gives the real boundaries, shows good agreement. This color image edge detection method is applicable to other similar cases according to their image properties.
Keywords: color image processing, edge detection, ice contour boundary, salt ice, thermal image
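A toy version of the color-difference idea can be sketched as follows: flag a boundary wherever the red or blue channel jumps sharply between neighbouring pixels, and ignore the green channel. The threshold and the random stand-in image are assumptions for illustration, not values from the study:

```python
# Illustrative colour-difference edge sketch on the red and blue channels only.
import numpy as np

def contour_edges(rgb_image: np.ndarray, threshold: float = 12.0) -> np.ndarray:
    img = rgb_image.astype(float)
    red, blue = img[..., 0], img[..., 2]               # green carries no useful information
    edges = np.zeros(img.shape[:2], dtype=bool)
    for channel in (red, blue):
        dx = np.abs(np.diff(channel, axis=1, prepend=channel[:, :1]))
        dy = np.abs(np.diff(channel, axis=0, prepend=channel[:1, :]))
        edges |= (dx > threshold) | (dy > threshold)    # large colour jump = contour boundary
    return edges

demo = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)   # stand-in for a thermal frame
print(contour_edges(demo).sum(), "boundary pixels")
```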
Procedia PDF Downloads 312
1600 A New Floating Point Implementation of Base 2 Logarithm
Authors: Ahmed M. Mansour, Ali M. El-Sawy, Ahmed T. Sayed
Abstract:
Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication and information theory. They are primarily used in hardware calculations for handling multiplications, divisions, powers, and roots effectively. There are three commonly used bases for logarithms: the logarithm with base 10 is called the common logarithm, the natural logarithm has base e, and the binary logarithm has base 2. This paper demonstrates different methods of calculating log2, showing the complexity of each, identifies the most accurate and efficient, and gives insights into their hardware design. We present a new method called Floor Shift for fast calculation of log2, and then we combine this algorithm with a Taylor series to improve the accuracy of the output, which we illustrate with two examples. We finally compare the algorithms and conclude with our remarks.
Keywords: logarithms, log2, floor, iterative, CORDIC, Taylor series
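The abstract does not spell out the Floor Shift formulation, but the general idea of splitting log2 into an integer part obtained from the position of the leading one bit and a fractional part refined by a Taylor series can be sketched as follows (a software illustration only, not the paper's hardware design):

```python
# Illustrative log2 sketch: integer part from the exponent, fraction from a Taylor series.
import math

def log2_floor_shift_taylor(x: float, terms: int = 12) -> float:
    """Integer part of log2 from the binary exponent (a floor/shift), fractional
    part from a Taylor series for ln(1 + f); accuracy drops as f approaches 1,
    where more terms or further range reduction would be needed."""
    if x <= 0:
        raise ValueError("log2 is undefined for non-positive input")
    m, e = math.frexp(x)              # x = m * 2**e with m in [0.5, 1)
    m, e = 2.0 * m, e - 1             # renormalise: m in [1, 2), integer part = e
    f = m - 1.0                       # 0 <= f < 1
    ln_1pf = sum((-1) ** (k + 1) * f ** k / k for k in range(1, terms + 1))
    return e + ln_1pf / math.log(2.0)

for x in (0.75, 1.0, 1.25, 5.0):
    print(f"{x:8.2f}  approx={log2_floor_shift_taylor(x):.6f}  exact={math.log2(x):.6f}")
```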
Procedia PDF Downloads 530
1599 Mobile Application Tool for Individual Maintenance Users on High-Rise Residential Buildings in South Korea
Authors: H. Cha, J. Kim, D. Kim, J. Shin, K. Lee
Abstract:
Since the 1980s, rapid economic growth has resulted in a large number of aged apartment buildings in South Korea. Nevertheless, building maintenance practice remains insufficient. In this study, to facilitate building maintenance, the authors classified building defects into three levels according to their level of performance and developed a mobile application tool based on the appropriate feedback for each level. The feedback structure consisted of a 'Maintenance manual phase', an 'Online feedback phase', and a 'Repair work phase of the specialty contractors'. In order to implement each phase, the authors devised the necessary database for each phase and created a prototype system that can be developed further. The authors expect that building users can easily maintain their buildings by using this application.
Keywords: building defect, maintenance practice, mobile application, system algorithm
Procedia PDF Downloads 187
1598 Method of Cluster Based Cross-Domain Knowledge Acquisition for Biologically Inspired Design
Authors: Shen Jian, Hu Jie, Ma Jin, Peng Ying Hong, Fang Yi, Liu Wen Hai
Abstract:
Biologically inspired design inspires inventions and new technologies in the field of engineering by mimicking functions, principles, and structures in the biological domain. To deal with the obstacles of cross-domain knowledge acquisition in the existing biologically inspired design process, functional semantic clustering based on functional feature semantic correlation and environmental constraint clustering composition based on environmental characteristic constraining adaptability are proposed. A knowledge cell clustering algorithm and the corresponding prototype system are developed. Finally, the effectiveness of the method is verified through the design of a visual prosthetic device.
Keywords: knowledge clustering, knowledge acquisition, knowledge based engineering, knowledge cell, biologically inspired design
Procedia PDF Downloads 425
1597 Solving Process Planning and Scheduling with Number of Operation Plus Processing Time Due-Date Assignment Concurrently Using a Genetic Search
Authors: Halil Ibrahim Demir, Alper Goksu, Onur Canpolat, Caner Erden, Melek Nur
Abstract:
Traditionally, process planning, scheduling and due-date assignment are performed sequentially and separately. The high interrelation between these functions makes their integration very useful. Although there are numerous works on integrated process planning and scheduling and many works on scheduling with due-date assignment, there are only a few works on the integration of these three functions. Here we tested different integration levels of these three functions and found the fully integrated version to be the best. We applied genetic search and random search, and genetic search was found to perform better than random search. We penalized all earliness, tardiness and due-date related costs. Since all three terms are undesired, it is better to penalize all of them.
Keywords: process planning, scheduling, due-date assignment, genetic algorithm, random search
Procedia PDF Downloads 373
1596 How to Talk about It without Talking about It: Cognitive Processing Therapy Offers Trauma Symptom Relief without Violating Cultural Norms
Authors: Anne Giles
Abstract:
Humans naturally wish they could forget traumatic experiences. To help prevent future harm, however, the human brain has evolved to retain data about experiences of threat, alarm, or violation. When given compassionate support and assistance with thinking helpfully and realistically about traumatic events, most people can adjust to experiencing hardships, albeit with residual sad, unfortunate memories. Persistent, recurrent, intrusive memories, difficulty sleeping, emotion dysregulation, and avoidance of reminders, however, may be symptoms of Post-traumatic Stress Disorder (PTSD). Brain scans show that PTSD affects brain functioning. We currently have no physical means of restoring the system of brain structures and functions involved in PTSD. Medications may ease some symptoms but not others. However, forms of "talk therapy" with cognitive components have been found by researchers to reduce, even resolve, a broad spectrum of trauma symptoms. Many cultures have taboos against talking about hardships. Individuals may present themselves to mental health care professionals with severe, disabling trauma symptoms but, because of cultural norms, be unable to speak about them. In China, for example, relationship expectations may include the belief, "Bad things happening in the family should stay in the family (jiāchǒu bùkě wàiyán 家丑不可外扬)." The concept of "family (jiā 家)" may include partnerships, close and extended families, communities, companies, and the nation itself. In contrast to many trauma therapies, Cognitive Processing Therapy (CPT) for Post-traumatic Stress Disorder asks its participants to focus not on "what" happened but on "why" they think the trauma(s) occurred. The question "why" activates and exercises cognitive functioning. Brain scans of individuals with PTSD reveal that executive-functioning portions of the brain are inadequately active while emotion centers are overly active. CPT conceptualizes PTSD as a network of cognitive distortions that keep an individual "stuck" in this under-functioning and over-functioning dynamic. Through asking participants forms of the question "why," plus offering a protocol for examining answers and relinquishing unhelpful beliefs, CPT assists individuals in consciously reactivating the cognitive, executive functions of their brains, thus restoring normal functioning and reducing distressing trauma symptoms. The culturally sensitive components of CPT that allow people to "talk about it without talking about it" may offer the possibility of worldwide relief from symptoms of trauma.
Keywords: cognitive processing therapy (CPT), cultural norms, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 212
1595 High Capacity Reversible Watermarking through Interpolated Error Shifting
Authors: Hae-Yeoun Lee
Abstract:
Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation
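The following toy sketch illustrates the general interpolation-error-shifting idea on a grayscale image: anchor pixels stay untouched, prediction errors at the remaining pixels are histogram-shifted to embed bits, and the shift is undone on extraction. The prediction pattern, the choice of peak bin and the omission of overflow pre-compensation are simplifying assumptions, not the paper's exact scheme:

```python
# Toy reversible embedding via interpolation-error histogram shifting.
import numpy as np

def embed(img, bits):
    img, bits = img.astype(np.int32).copy(), list(bits)
    for r in range(img.shape[0]):
        for c in range(2, img.shape[1] - 1, 2):          # even columns, odd anchors
            pred = (img[r, c - 1] + img[r, c + 1]) // 2
            e = img[r, c] - pred
            if e > 0:
                img[r, c] = pred + e + 1                  # shift positive errors up
            elif e == 0 and bits:
                img[r, c] = pred + bits.pop(0)            # embed one bit at the peak
    return img

def extract(marked, n_bits):
    marked, bits = marked.astype(np.int32).copy(), []
    for r in range(marked.shape[0]):
        for c in range(2, marked.shape[1] - 1, 2):
            pred = (marked[r, c - 1] + marked[r, c + 1]) // 2
            e = marked[r, c] - pred
            if e > 1:
                marked[r, c] = pred + e - 1               # undo the shift
            elif e in (0, 1) and len(bits) < n_bits:
                bits.append(int(e))                       # recovered bit
                marked[r, c] = pred                       # restore original pixel
    return marked, bits

original = np.full((4, 8), 128, dtype=np.uint8)
marked = embed(original, [1, 0, 1])
restored, recovered = extract(marked, 3)
print(recovered, np.array_equal(restored, original.astype(np.int32)))
```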
Procedia PDF Downloads 319
1594 Assessing Suitability and Acceptability of Development Plans and Town Planning Scheme in Small and Medium Town: A Case of Gujarat
Authors: Priyanshu Sharma
Abstract:
Urban development mechanisms have evolved over the years in India, and various planning models and tools have been adopted by different states. Large cities have been able to prepare and implement plans with varying degrees of success. However, it has been observed that these mechanisms struggle to gain momentum in small and medium towns. Gujarat has very robust legislation that empowers planning authorities to prepare development plans (DP) and town planning schemes (TPS). The DP-TPS planning methods are quite popular for large cities in Gujarat. However, it has been observed that in smaller towns these methods of plan preparation are facing severe agitation. Recently, development authorities of many small towns like Himmatnagar, Nadiad, and Junagadh have faced serious protest from local residents. This is because of the large amount of land deduction under the provisions of the DP and TPS. Such opposition has been increasing in Gujarat since 2012. This study aims to understand in detail the reasons for agitation against the plans prepared by smaller towns. It further examines whether the current framework of urban planning (DP and TPS) is really suitable for these towns. After understanding the development concerns and background, the aim and objectives of the study were outlined. Aim: To evaluate the suitability and acceptability of the current urban development mechanism for small and medium towns. Objectives: (i) To review the GTPUD Act and identify the provisions related to small and medium towns; (ii) To understand the preparation process of development plans and town planning schemes and the issues related to it; (iii) To understand the issues raised by different stakeholders with respect to the plans, because of which the plans and authorities faced agitation; (iv) To find out the possible options through which these plans can be made suitable and acceptable to the stakeholders. The approach of this study is primarily qualitative, with the intention of understanding the timeline of the preparation of development plans and town planning schemes and the issues related to it. On the basis of the literature study, three towns were selected, and a detailed questionnaire was prepared for the stakeholders (development authorities and local residents), which covered the time taken in the preparation of the DP and TPS, the issues faced during the process, and the parties involved. Lastly, the study looks into the land value of original plots and readjusted plots, concluding whether the TP scheme model really worked in small and medium towns: land deduction under a TP scheme is allowed up to 50% as per the act, there is no distinct provision for small and medium towns under the act, and it is unclear how this can be justified in smaller towns where market values have not changed over the years. After analyzing the issues and the reasons behind the agitation against the DP and TPS in these small and medium towns, broader recommendations are given which can make these plans acceptable and suitable for the stakeholders.
Keywords: development plans, medium towns, small towns, town planning schemes
Procedia PDF Downloads 155
1593 Deployment of Matrix Transpose in Digital Image Encryption
Authors: Okike Benjamin, Garba E J. D.
Abstract:
Encryption is used to conceal information from prying eyes. Presently, information and data encryption are common due to the volume of data and information in transit across the globe on a daily basis. Image encryption has yet to receive the attention from researchers that it deserves; in other words, video and multimedia documents are exposed to unauthorized access. The authors propose image encryption using matrix transpose. An algorithm that allows image encryption is developed. In this proposed image encryption technique, the image to be encrypted is split into parts based on the image size. Each part is encrypted separately using matrix transpose. The actual encryption operates on the picture elements (pixels) that make up the image. After encrypting each part of the image, the positions of the encrypted parts are swapped before transmission of the image can take place. Swapping the positions of the parts is carried out to make the encrypted image harder for any cryptanalyst to decrypt.
Keywords: image encryption, matrices, pixel, matrix transpose
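A compact sketch of the described scheme (block-wise transposition followed by swapping the block positions) is given below; the block size and the keyed permutation used for the swap are illustrative assumptions:

```python
# Illustrative block-transpose-and-swap image encryption sketch.
import numpy as np

def encrypt(img, block=64, seed=42):
    h, w = img.shape
    rows, cols = h // block, w // block
    blocks = [img[r*block:(r+1)*block, c*block:(c+1)*block].T      # transpose each part
              for r in range(rows) for c in range(cols)]
    perm = np.random.default_rng(seed).permutation(len(blocks))    # swap part positions
    out = np.zeros_like(img)
    for dst, src in enumerate(perm):
        r, c = divmod(dst, cols)
        out[r*block:(r+1)*block, c*block:(c+1)*block] = blocks[src]
    return out, perm

def decrypt(enc, perm, block=64):
    h, w = enc.shape
    cols = w // block
    out = np.zeros_like(enc)
    for dst, src in enumerate(perm):
        r, c = divmod(dst, cols)
        rs, cs = divmod(src, cols)
        out[rs*block:(rs+1)*block, cs*block:(cs+1)*block] = \
            enc[r*block:(r+1)*block, c*block:(c+1)*block].T        # undo the transpose
    return out

img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
enc, perm = encrypt(img)
print(np.array_equal(decrypt(enc, perm), img))   # True: fully reversible
```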
Procedia PDF Downloads 420
1592 Optimal Maintenance and Improvement Policies in Water Distribution System: Markov Decision Process Approach
Authors: Jong Woo Kim, Go Bong Choi, Sang Hwan Son, Dae Shik Kim, Jung Chul Suh, Jong Min Lee
Abstract:
A Markov Decision Process (MDP) based methodology is implemented in order to establish the optimal schedule which minimizes the cost. The formulation of the MDP problem is presented using information about the current state of the pipe, improvement cost, failure cost and a pipe deterioration model. The objective function and the detailed dynamic programming (DP) algorithm are modified due to the difficulty of implementing the conventional DP approaches. The optimal schedule derived from the suggested model is compared to several policies via Monte Carlo simulation. The validity of the solution and the improvement in computational time are demonstrated.
Keywords: Markov decision processes, dynamic programming, Monte Carlo simulation, periodic replacement, Weibull distribution
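To make the underlying mechanism concrete, the sketch below runs value iteration on a small, hypothetical pipe-deterioration MDP; the states, costs and transition probabilities are invented for illustration and are not the calibrated values of the study:

```python
# Value-iteration sketch on a hypothetical pipe-deterioration MDP.
import numpy as np

states, actions = 4, 3                 # 0 = new ... 3 = failed; do nothing / repair / replace
action_cost = np.array([0.0, 30.0, 100.0])
failure_cost = np.array([0.0, 0.0, 0.0, 250.0])    # penalty for being in the failed state

# P[a, s, s']: transition probabilities (hypothetical ageing model)
P = np.zeros((actions, states, states))
P[0] = [[0.7, 0.3, 0.0, 0.0], [0.0, 0.6, 0.4, 0.0],
        [0.0, 0.0, 0.5, 0.5], [0.0, 0.0, 0.0, 1.0]]
P[1] = [[0.9, 0.1, 0.0, 0.0], [0.7, 0.3, 0.0, 0.0],
        [0.0, 0.7, 0.3, 0.0], [0.0, 0.0, 0.8, 0.2]]
P[2][:, 0] = 1.0                       # replacement always returns the pipe to "new"

gamma, V = 0.95, np.zeros(states)
for _ in range(500):                   # value iteration
    Q = action_cost[:, None] + failure_cost[None, :] + gamma * P @ V
    V = Q.min(axis=0)
policy = Q.argmin(axis=0)
print("optimal action per state:", policy)
```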
Procedia PDF Downloads 421
1591 Aerobic Bioprocess Control Using Artificial Intelligence Techniques
Authors: M. Caramihai, Irina Severin
Abstract:
This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation. The objective of the process control is to produce biomass in a desired physiological state. The work demonstrates that the designed Hybrid Control Techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm as well as its tuning through realistic simulations is presented. Taking into consideration the synergism of different paradigms like fuzzy logic, neural networks, and symbolic artificial intelligence (AI), this paper presents a complete intelligent control architecture with application in bioprocess control.
Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques
Procedia PDF Downloads 419
1590 Channel Estimation for LTE Downlink
Authors: Rashi Jain
Abstract:
LTE systems employ Orthogonal Frequency Division Multiplexing (OFDM) as the multiple access technology for the downlink channels. For enhanced performance, accurate channel estimation is required. Various algorithms such as Least Squares (LS), Minimum Mean Square Error (MMSE) and Recursive Least Squares (RLS) can be employed for this purpose. The paper proposes a channel estimation algorithm based on the Kalman filter for the LTE downlink system. Using the frequency-domain pilots, the initial channel response is obtained using the LS criterion. Then a Kalman filter is employed to track the channel variations in the time domain. To suppress the noise within a symbol, threshold processing is employed. The paper draws a comparison between LS, MMSE, RLS and the Kalman filter for channel estimation. The parameters for evaluation are Bit Error Rate (BER), Mean Square Error (MSE) and run time.
Keywords: LTE, channel estimation, OFDM, RLS, Kalman filter, threshold
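The two-stage idea (a per-symbol LS estimate from pilots, refined over time by a Kalman filter) can be illustrated with a scalar example. The AR(1) channel model, noise levels and pilot value below are assumptions chosen for the sketch, not parameters from the paper:

```python
# Scalar sketch: pilot-based LS estimates smoothed by a Kalman filter over time.
import numpy as np

rng = np.random.default_rng(0)
T, a = 200, 0.98                          # OFDM symbols, channel time correlation
h = np.zeros(T, dtype=complex)
h[0] = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
for t in range(1, T):                     # true time-varying channel tap (AR(1) model)
    h[t] = a * h[t-1] + np.sqrt(1 - a**2) * (rng.normal() + 1j * rng.normal()) / np.sqrt(2)

pilot = 1 + 0j
noise = 0.3 * (rng.normal(size=T) + 1j * rng.normal(size=T)) / np.sqrt(2)
y = h * pilot + noise
h_ls = y / pilot                          # per-symbol least-squares estimate

q, r = 1 - a**2, 0.3**2                   # process / measurement noise variances
h_kf, p, est = np.zeros(T, dtype=complex), 1.0, 0
for t in range(T):
    est, p = a * est, a**2 * p + q        # predict
    k = p / (p + r)                       # Kalman gain
    est, p = est + k * (h_ls[t] - est), (1 - k) * p   # update with the LS measurement
    h_kf[t] = est

print("LS MSE :", np.mean(np.abs(h_ls - h)**2))
print("KF MSE :", np.mean(np.abs(h_kf - h)**2))
```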
Procedia PDF Downloads 353
1589 Black-Box-Base Generic Perturbation Generation Method under Salient Graphs
Authors: Dingyang Hu, Dan Liu
Abstract:
DNN (Deep Neural Network) deep learning models are widely used in classification, prediction, and other task scenarios. To address the difficulty of generating generic adversarial perturbations for deep learning models under black-box conditions, a generic adversarial perturbation generation method based on a saliency map (CJsp) is proposed, which obtains salient image regions by counting the factors through which the input features of an image influence the output results. This method can be understood as a saliency map attack algorithm that obtains false classification results by reducing the weights of salient feature points. Experiments also demonstrate that this method achieves a high success rate in transfer attacks and serves as a batch adversarial sample generation method.
Keywords: adversarial sample, gradient, probability, black box
Procedia PDF Downloads 102
1588 Solving Optimal Control of Semilinear Elliptic Variational Inequalities Obstacle Problems using Smoothing Functions
Authors: El Hassene Osmani, Mounir Haddou, Naceurdine Bensalem
Abstract:
In this paper, we investigate optimal control problems governed by semilinear elliptic variational inequalities involving constraints on the state, and more precisely, the obstacle problem. We present a relaxed formulation for the problem using smoothing functions. Since we adopt a numerical point of view, we first relax the feasible domain of the problem; then, using both mathematical programming methods and penalization methods, we obtain optimality conditions with smooth Lagrange multipliers. Some numerical experiments using the IPOPT (Interior Point Optimizer) algorithm are presented to verify the efficiency of our approach.
Keywords: complementarity problem, IPOPT, Lagrange multipliers, mathematical programming, optimal control, smoothing methods, variational inequalities
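The abstract does not state which smoothing function is used; a standard example of how a complementarity condition arising from the obstacle constraint can be smoothed is the regularized Fischer-Burmeister function shown below, included only as an illustration of the technique:

```latex
% A standard smoothing of the complementarity condition 0 <= a \perp b >= 0
% (not necessarily the smoothing function used in the paper):
\[
  \phi_\mu(a,b) \;=\; a + b - \sqrt{a^2 + b^2 + 2\mu}, \qquad \mu > 0,
\]
\[
  \phi_\mu(a,b)=0 \iff a>0,\; b>0,\; ab=\mu,
  \qquad\text{while for } \mu = 0:\quad
  \phi_0(a,b)=0 \iff a\ge 0,\; b\ge 0,\; ab=0 .
\]
```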
Procedia PDF Downloads 171
1587 Digital Cinema Watermarking State of Art and Comparison
Authors: H. Kelkoul, Y. Zaz
Abstract:
Nowadays, the vigorous popularity of video processing techniques has resulted in an explosive growth in the illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore some watermarking techniques in order to observe their specificities and select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that includes the date, time and place where the hacking was done. We have studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain, and finally the digital cinema watermarking transform domain. In this paper, a detailed technique is presented where embedding is performed using a direct-sequence spread spectrum technique in the DWT transform domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
Keywords: digital cinema, watermarking, wavelet DWT, spread spectrum, JPEG2000 MPEG4
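A minimal illustration of direct-sequence spread-spectrum embedding in the DWT domain is sketched below using PyWavelets; the wavelet, embedding strength and correlation detector are assumptions for the sketch, not the configuration used in the paper:

```python
# Illustrative DS spread-spectrum watermark in the DWT approximation band.
import numpy as np
import pywt

def embed(img, key=7, alpha=4.0):
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), "haar")
    chip = np.random.default_rng(key).choice([-1.0, 1.0], size=cA.shape)  # spreading code
    return pywt.idwt2((cA + alpha * chip, (cH, cV, cD)), "haar"), chip

def detect(img, chip):
    cA, _ = pywt.dwt2(img.astype(float), "haar")
    cA = cA - cA.mean()
    return float(np.mean(cA * chip))          # large positive correlation => watermark present

host = np.random.default_rng(1).integers(0, 256, (256, 256)).astype(float)
marked, chip = embed(host)
print("correlation, marked image  :", detect(marked, chip))
print("correlation, original image:", detect(host, chip))
```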
Procedia PDF Downloads 250
1586 Bayesian Structural Identification with Systematic Uncertainty Using Multiple Responses
Authors: André Jesus, Yanjie Zhu, Irwanda Laory
Abstract:
Structural health monitoring is one of the most promising technologies for averting structural risk and achieving economic savings. Analysts often have to deal with a considerable variety of uncertainties that arise during a monitoring process. Namely, the widespread application of numerical models (model-based approaches) is accompanied by a widespread concern about quantifying the uncertainties prevailing in their use. Some of these uncertainties are related to the deterministic nature of the model (code uncertainty), others to the variability of its inputs (parameter uncertainty) and the discrepancy between model and experiment (systematic uncertainty). The actual process always exhibits random behaviour (observation error) even when conditions are set identically (residual variation). Bayesian inference assumes that the parameters of a model are random variables with an associated PDF, which can be inferred from experimental data. However, in many Bayesian methods the determination of systematic uncertainty can be problematic. In this work, systematic uncertainty is associated with a discrepancy function. The numerical model and the discrepancy function are approximated by Gaussian processes (surrogate model). Finally, to avoid the computational burden of a fully Bayesian approach, the parameters that characterise the Gaussian processes were estimated in a four-stage process (modular Bayesian approach). The proposed methodology has been successfully applied in fields such as geoscience, biomedicine and particle physics, but never in the SHM context. This approach considerably reduces the computational burden, although the extent of the considered uncertainties is lower (second-order effects are neglected). To successfully identify the considered uncertainties, this formulation was extended to consider multiple responses. The efficiency of the algorithm has been tested on a small-scale aluminium bridge structure subjected to thermal expansion due to infrared heaters. A comparison of its performance with responses measured at different points of the structure and the associated degrees of identifiability is also carried out. A numerical FEM model of the structure was developed, and the stiffness of its supports is considered as a parameter to calibrate. Results show that the modular Bayesian approach performed best when responses of the same type had the lowest spatial correlation. Based on previous literature, using different types of responses (strain, acceleration, and displacement) should also improve the identifiability problem. Uncertainties due to parametric variability, observation error, residual variability, code variability and systematic uncertainty were all recovered. For this example, the algorithm's performance was stable and considerably quicker than Bayesian methods that account for the full extent of uncertainties. Future research with real-life examples is required to fully assess the advantages and limitations of the proposed methodology.
Keywords: bayesian, calibration, numerical model, system identification, systematic uncertainty, Gaussian process
Procedia PDF Downloads 324
1585 Road Vehicle Recognition Using Magnetic Sensing Feature Extraction and Classification
Authors: Xiao Chen, Xiaoying Kong, Min Xu
Abstract:
This paper presents a road vehicle detection approach for intelligent transportation systems. The approach mainly uses a low-cost magnetic sensor and an associated data collection system to collect magnetic signals. The system can measure changes in the magnetic field, and it can also detect and count vehicles. We extend Mel Frequency Cepstral Coefficients to analyze vehicle magnetic signals. Vehicle type features are extracted using cepstrum, frame energy, and gap-cepstrum representations of the magnetic signals. We design a 2-dimensional map algorithm using Vector Quantization to classify vehicle magnetic features into four typical types of vehicles in Australian suburbs: sedan, van, truck, and bus. Experimental results show that our approach achieves a high level of accuracy for vehicle detection and classification.
Keywords: vehicle classification, signal processing, road traffic model, magnetic sensing
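The sketch below shows the general flavour of such a pipeline: cepstral and frame-energy features computed from one signal frame, then assignment to the nearest codeword of a vector-quantisation codebook. The codebook, the signal and the feature dimensions are stand-ins for illustration rather than the trained system of the paper:

```python
# Illustrative cepstrum + frame-energy features classified by nearest VQ codeword.
import numpy as np

def frame_features(frame, n_ceps=8):
    spectrum = np.abs(np.fft.rfft(frame)) + 1e-12
    cepstrum = np.fft.irfft(np.log(spectrum))       # real cepstrum of the magnetic frame
    energy = np.log(np.sum(frame ** 2) + 1e-12)     # frame energy
    return np.concatenate(([energy], cepstrum[1:n_ceps + 1]))

classes = ["sedan", "van", "truck", "bus"]
rng = np.random.default_rng(1)
codebook = rng.normal(size=(4, 9))                  # stands in for a trained VQ codebook

signal = rng.normal(size=256)                       # stands in for one detection frame
f = frame_features(signal)
label = classes[int(np.argmin(np.linalg.norm(codebook - f, axis=1)))]
print(label)
```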
Procedia PDF Downloads 318
1584 Research on ARQ Transmission Technique in Mars Detection Telecommunications System
Authors: Zhongfei Cai, Hui He, Changsheng Li
Abstract:
This paper studies the automatic repeat request (ARQ) transmission technique in a Mars detection telecommunications system. An ARQ method applied to the Proximity-1 space link protocol is proposed. In order to ensure efficient and reliable data transmission, this ARQ method combines the characteristics of different ARQ schemes. Considering the Mars detection communication environments, the paper analyzes the saturation throughput rate, packet dropping probability, average delay and energy efficiency of different ARQ algorithms. Combining these results with the theory of ARQ transmission, an ARQ transmission scheme for the Mars detection telecommunications system is established. The simulation results show that this algorithm achieves an excellent saturation throughput rate and energy efficiency with low complexity.
Keywords: ARQ, mars, CCSDS, proximity-1, deep space
Procedia PDF Downloads 338
1583 A Scoping Review of the Relationship Between Oral Health and Wellbeing: The Myth and Reality
Authors: Heba Salama, Barry Gibson, Jennifer Burr
Abstract:
Introduction: It is often argued that better oral health leads to better wellbeing and that the goal of dental care is to improve wellbeing. Notwithstanding, to the best of our knowledge, there is a lack of evidence to support the relationship between oral health and wellbeing. Aim: The scoping review aims to examine current definitions of health and wellbeing as well as map the evidence examining the relationship between oral health and wellbeing. Methods: The scoping review followed the Preferred Reporting Items for Systematic Reviews Extension for Scoping Reviews (PRISMA-ScR). A two-phase search strategy was followed because of the unmanageable number of hits returned. The first phase was to identify how wellbeing is conceptualised in the oral health literature, and the second phase was to search for the extracted keywords. The extracted keywords were searched in four databases: PubMed, CINAHL, PsycINFO, and Web of Science. To limit the number of studies to a manageable amount, the search was limited to open-access studies published in the last five years (2018 to 2022). Results: Only eight studies (0.1%) of the 5455 results met the review inclusion criteria. Most of the included studies defined wellbeing based on hedonic theory, and the Satisfaction with Life Scale was the most commonly used instrument. Although the research results are inconsistent, it has generally been shown that there is a weak or no association between oral health and wellbeing. Interpretation: The review revealed a very important point: the oral health literature uses loose definitions that have significant implications for empirical research and result in misleading evidence-based conclusions. According to the review results, improving oral health is not a key factor in improving wellbeing, and investing in oral health care to improve wellbeing does not appear to be a top priority to communicate to policymakers. This does not imply that there should be no investment in oral health care to improve oral health, which could have an indirect link to wellbeing by eliminating potential oral-health-related barriers to the quality of life that could represent the foundation of wellbeing. Limitation: Only the most recent five years (2018–2022), peer-reviewed English-language literature, and four electronic databases were included in the search. These restrictions were put in place to keep the volume of literature at a manageable level, which suggests that some significant studies might have been omitted. Furthermore, the study used a definition of wellbeing that is still evolving and with which not everyone may agree. Conclusion: Whilst it is a ubiquitous argument that oral health is related to wellbeing, and this seems logical, there is little empirical evidence to support this claim. This question, therefore, requires much more detailed consideration. Funding: This project was funded by the Ministry of Higher Education and Scientific Research in Libya and Tripoli University.
Keywords: oral health, wellbeing, satisfaction, emotion, quality of life, oral health related quality of life
Procedia PDF Downloads 118
1582 Density-based Denoising of Point Cloud
Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng
Abstract:
Point cloud source data for surface reconstruction are usually contaminated with noise and outliers. To overcome this, we present a novel approach using a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of multivariate KDE using a particle swarm optimization technique, which ensures the robust performance of the density estimation. Then we use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters. We then compute the distance of each point from its centroid. Points belonging to outliers are then removed by an automatic thresholding scheme, which yields an accurate and economical point surface. The experimental results show that our approach is comparably robust and efficient.
Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation
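The core density-based filtering step can be sketched very simply: estimate a multivariate KDE over the cloud and drop the points whose density falls below a threshold. The fixed bandwidth and percentile threshold below are simplifications standing in for the PSO-optimised bandwidth and the automatic thresholding scheme described above:

```python
# Illustrative KDE-based outlier removal for a synthetic 3-D point cloud.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
surface = rng.normal(size=(500, 3)) * [1.0, 1.0, 0.05]      # points near a plane
outliers = rng.uniform(-4, 4, size=(25, 3))                  # sparse noise
cloud = np.vstack([surface, outliers])

kde = gaussian_kde(cloud.T, bw_method=0.2)    # multivariate KDE, assumed bandwidth
density = kde(cloud.T)                        # estimated density at every point
keep = density > np.percentile(density, 5)    # drop the lowest-density 5 %

print(f"kept {keep.sum()} of {len(cloud)} points")
```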
Procedia PDF Downloads 344
1581 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. The paper extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. To evaluate the effectiveness of the proposed method, it is tested on the MNIST, CENPARMI and CEDAR databases and on newly collected data. The proposed method is applied to more than one lakh (100,000) digit images and obtains good comparative recognition results. A recognition rate of about 97.32% is achieved.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
Procedia PDF Downloads 460
1580 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the significant and fast growth in the development of and demand for multimedia products has been outstripping the available device bandwidth and network storage memory. Consequently, the theory of data compression becomes more significant for reducing data redundancy in order to save on the transfer and storage of data. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance compared with existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, giving better image compression in a short execution time.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
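A generic block power (orthogonal) iteration for the leading singular subspaces, used to form a low-rank image approximation, is sketched below; it illustrates the technique in general and is not necessarily the exact algorithm of the paper:

```python
# Illustrative block power iteration for a rank-k image approximation.
import numpy as np

def block_svd_power(A, k=20, iters=100, seed=0):
    """Return orthonormal bases U, V of the dominant left/right singular subspaces."""
    rng = np.random.default_rng(seed)
    V = np.linalg.qr(rng.normal(size=(A.shape[1], k)))[0]
    for _ in range(iters):
        U = np.linalg.qr(A @ V)[0]        # power step + re-orthonormalisation
        V = np.linalg.qr(A.T @ U)[0]
    return U, V

x = np.linspace(0, 1, 256)                # smooth stand-in for a grayscale image
img = np.outer(np.sin(6 * x), np.cos(4 * x)) + 0.01 * np.random.default_rng(0).normal(size=(256, 256))

U, V = block_svd_power(img, k=20)
core = U.T @ img @ V                       # approximately diagonal (singular values)
approx = U @ core @ V.T                    # rank-20 compressed reconstruction
print("relative error:", np.linalg.norm(img - approx) / np.linalg.norm(img))
```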
Procedia PDF Downloads 384