Search results for: inverse Laplace transform techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3420

2760 Better Perception of Low Resolution Images Using Wavelet Interpolation Techniques

Authors: Tarun Gulati, Kapil Gupta, Dushyant Gupta

Abstract:

High resolution images are always desired as they contain more information and can better represent the original data. Interpolation is therefore used to convert a low resolution image into a high resolution one. The quality of such a high resolution image depends on the interpolation function and is assessed in terms of the sharpness of the image. This paper focuses on wavelet-based interpolation techniques in which an input image is divided into subbands. Each subband is processed separately, and the processed subbands are finally combined to get the super resolution image.
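
As an illustration of the general idea only (not the authors' exact DWT-SR or SWT pipeline), a minimal Python sketch using the PyWavelets and SciPy packages might look as follows; the Haar wavelet, the cubic upsampling of the detail bands, and the assumption of even image dimensions are all choices made here for illustration:

```python
import numpy as np
import pywt
from scipy.ndimage import zoom

def dwt_super_resolve(lr_image, wavelet="db1"):
    """Minimal DWT-based super-resolution sketch (2x upscaling).

    The low-resolution image is treated as the approximation (LL) band;
    detail bands are estimated from its own DWT, upsampled back to the
    LL size, and the inverse DWT fuses them into a larger image.
    Assumes a grayscale image with even dimensions.
    """
    lr = lr_image.astype(float)
    # Estimate detail subbands from the LR image itself.
    _, (cH, cV, cD) = pywt.dwt2(lr, wavelet)
    # Upsample the detail estimates to match the LL band (the LR image).
    cH, cV, cD = (zoom(band, 2, order=3) for band in (cH, cV, cD))
    # The inverse DWT produces an image roughly twice the input size.
    return pywt.idwt2((lr, (cH, cV, cD)), wavelet)

# Example: upscale a random 64x64 "image" to 128x128.
hr = dwt_super_resolve(np.random.rand(64, 64))
print(hr.shape)  # (128, 128)
```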

Keywords: SWT, DWTSR, DWTSWT, DWCWT.

2759 Identification of Promising Infant Clusters to Obtain Improved Block Layout Designs

Authors: Mustahsan Mir, Ahmed Hassanin, Mohammed A. Al-Saleh

Abstract:

The layout optimization of building blocks of unequal areas has applications in many disciplines including VLSI floorplanning, macrocell placement, unequal-area facilities layout optimization, and plant or machine layout design. A number of heuristics and some analytical and hybrid techniques have been published to solve this problem. This paper presents an efficient high-quality building-block layout design technique especially suited for solving large-size problems. The higher efficiency and improved quality of optimized solutions are made possible by introducing the concept of Promising Infant Clusters in a constructive placement procedure. The results presented in the paper demonstrate the improved performance of the presented technique for benchmark problems in comparison with published heuristic, analytic, and hybrid techniques.

Keywords: Block layout problem, building-block layout design, CAD, optimization, search techniques.

2758 Synthesis of Y2O3 Films by Spray Coating with Milled EDTA·Y·H Complexes

Authors: Keiji Komatsu, Tetsuo Sekiya, Ayumu Toyama, Atsushi Nakamura, Ikumi Toda, Shigeo Ohshio, Hiroyuki Muramatsu, Hidetoshi Saitoh, Atsushi Nakamura, Ariyuki Kato

Abstract:

Yttrium oxide (Y2O3) films have been successfully deposited with yttrium-ethylenediamine tetraacetic acid (EDTA·Y·H) complexes prepared by various milling techniques. The effects of the properties of the EDTA·Y·H complex on the properties of the deposited Y2O3 films have been analyzed. Seven different types of raw EDTA·Y·H complexes were prepared by milling techniques such as ball milling, hammer milling, commercial milling, and mortar milling. The milled EDTA·Y·H complexes exhibited various particle sizes and distributions, depending on the milling method. Furthermore, we analyzed the crystal structure, morphology, and elemental distribution profile of the metal oxide films deposited on a stainless steel substrate with the milled EDTA·Y·H complexes. Depending on the milling technique, the flow properties of the raw powders differed. The X-ray diffraction pattern of all the samples revealed the formation of the Y2O3 crystalline phase, irrespective of the milling technique. Of all the different milling techniques, the hammer milling technique is considered suitable for fabricating dense Y2O3 films.

Keywords: Powder sizes and distributions, Flame spray coating techniques, Yttrium oxide.

2757 A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

Authors: A. Bajit, M. Nahid, A. Tamtaoui, E. H. Bouyakhf

Abstract:

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to the wavelet transform coefficients before they drive the SPIHT encoding algorithm, in order to reach a targeted bit rate with improved perceptual quality with respect to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS); this metric plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting, and 3) the wavelet error sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
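
A minimal sketch of the weighting step only, assuming PyWavelets and made-up per-subband weights (the actual POEZIC weights derived from luminance masking, contrast masking, the CSF, and the WES are not reproduced here):

```python
import numpy as np
import pywt

def perceptually_weight(image, weights, wavelet="bior4.4", levels=3):
    """Sketch: scale each detail subband by a perceptual weight.

    `weights` is a hypothetical dict of CSF-style factors keyed by
    position in the decomposition (1 = coarsest detail level); the real
    POEZIC weights combine luminance/contrast masking, the CSF and the
    wavelet error sensitivity, which are not modelled here.
    """
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    weighted = [coeffs[0]]                      # approximation band kept as is
    for pos, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        w = weights[pos]
        weighted.append((w * cH, w * cV, w * cD))
    return weighted                             # would feed a SPIHT-style encoder

# Example with made-up weights; 'bior4.4' is PyWavelets' 9/7 biorthogonal filter.
img = np.random.rand(256, 256)
wcoeffs = perceptually_weight(img, weights={1: 0.6, 2: 1.0, 3: 0.8})
```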

Keywords: DWT, linear-phase 9/7 filter, 9/7 Wavelets Error Sensitivity WES, CSF implementation approaches, JND Just Noticeable Difference, Luminance masking, Contrast masking, standard SPIHT, Objective Quality Measure, Probability Score PS.

2756 Analysis of S.P.O Techniques for Prediction of Dynamic Behavior of the Plate

Authors: Byung-kyoo Jung, Weui-bong Jeong

Abstract:

In most cases, it is considerably difficult to measure structural vibration directly with many sensors because of complex geometry, time, and equipment cost. For this reason, this paper deals with the problem of locating sensors on a plate model by four advanced sensor placement optimization (S.P.O) techniques. It also suggests an evaluation index representing the degree of orthogonality between the natural modes. The index value assists in selecting the proper S.P.O technique and the optimal positions for monitoring dynamic systems without experiments.
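
Since the keywords name the modal assurance criterion (MAC), one plausible orthogonality-style index of the kind described is the standard MAC between mode shapes restricted to the candidate sensor positions; a small NumPy sketch (not necessarily the authors' exact index) is:

```python
import numpy as np

def mac_matrix(phi):
    """Modal Assurance Criterion between the columns (mode shapes) of phi.

    phi has shape (n_sensor_locations, n_modes); MAC[i, j] close to 0
    for i != j means the retained sensor set keeps the modes distinguishable.
    """
    num = np.abs(phi.T @ phi) ** 2
    norms = np.sum(phi * phi, axis=0)           # phi_i^T phi_i for each mode
    return num / np.outer(norms, norms)

# Example: 3 mode shapes sampled at 20 candidate sensor positions (random stand-in).
phi = np.random.rand(20, 3)
mac = mac_matrix(phi)
off_diag = mac - np.diag(np.diag(mac))
print(off_diag.max())   # one possible scalar measure of mode "orthogonality"
```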

Keywords: Genetic algorithm, Modal assurance criterion, Sensor placement optimization.

2755 Artificial Neural Network Development by means of Genetic Programming with Graph Codification

Authors: Daniel Rivero, Julián Dorado, Juan R. Rabuñal, Alejandro Pazos, Javier Pereira

Abstract:

The development of Artificial Neural Networks (ANNs) is usually a slow process in which the human expert has to test several architectures until finding the one that achieves the best results for a certain problem. This work presents a new technique that uses Genetic Programming (GP) for automatically generating ANNs. To do this, the GP algorithm had to be changed to work with graph structures, so that ANNs can be developed. This technique also allows obtaining simplified networks that solve the problem with a small group of neurons. In order to measure the performance of the system and to compare the results with other ANN development methods based on Evolutionary Computation (EC) techniques, several tests were performed on problems based on some of the most widely used test databases. The results of those comparisons show that the system achieves results comparable with the existing techniques and, in most cases, works better than those techniques.

Keywords: Artificial Neural Networks, Evolutionary Computation, Genetic Programming.

2754 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through the use of novel rapid processing techniques such as screen printing and Near-Infrared (NIR) radiative curing, the process time for sintering nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.

Keywords: Batteries, energy, iron, nickel, storage.

2753 Predictive Modelling Techniques in Sediment Yield and Hydrological Modelling

Authors: Adesoji T. Jaiyeola, Josiah Adeyemo

Abstract:

This paper presents an extensive review of literature relevant to the modelling techniques adopted in sediment yield and hydrological modelling. Several studies relating to sediment yield are discussed. Many research areas of sedimentation in rivers, runoff and reservoirs are presented. Different types of hydrological models and the different methods employed in selecting appropriate models for different case studies are analysed. Applications of evolutionary algorithms and artificial intelligence techniques are discussed and compared, especially in water resources management and modelling. This review concentrates on Genetic Programming (GP) and fully discusses its theory and applications. Successful applications of GP as a soft computing technique in sediment modelling are reviewed. Some fundamental issues such as benchmarks, generalization ability, bloat, over-fitting and other open issues relating to the working principles of GP are highlighted. This paper concludes with the identification of some research gaps in hydrological modelling and sediment yield.

Keywords: Artificial intelligence, evolutionary algorithm, genetic programming, sediment yield.

2752 Sensitivity Analysis of Real-Time Systems

Authors: Benjamin Gorry, Andrew Ireland, Peter King

Abstract:

Verification of real-time software systems can be expensive in terms of time and resources. Testing is the main method of proving correctness but has been shown to be a long and time consuming process. In everyday practice, engineers are usually unwilling to adopt formal approaches to correctness because of the overhead associated with developing their knowledge of such techniques. Performance modelling techniques allow systems to be evaluated with respect to timing constraints. This paper describes PARTES, a framework which guides the extraction of performance models from programs written in an annotated subset of C.

Keywords: Performance Modelling, Real-time, Sensitivity Analysis.

2751 PM10 Chemical Characteristics in a Background Site at the Universidad Libre Bogotá

Authors: Laura X. Martinez, Andrés F. Rodríguez, Ruth A. Catacoli

Abstract:

One of the most important aspects of air pollution is that PM10 concentrations maintain a fairly constant trend, with the exception of some places where they frequently surpass the allowed ranges established by Colombian legislation. The community that surrounds the Universidad Libre Bogotá includes a considerable number of students and workers, all of whom are possibly being exposed to PM10 for long periods of time while on campus. Thus, the chemical characterization of PM10 found in the ambient air at the Universidad Libre Bogotá was identified as a problem. A Hi-Vol sampler and EPA Test Method 5 were used to determine whether the air quality is adequate for the human respiratory system. Additionally, quartz fiber filters were utilized during sampling. Samples were taken three days a week during a dry period throughout the months of November and December 2015. The gravimetric analysis method was used to determine PM10 concentrations. The chemical characterization includes non-conventional carcinogenic pollutants. Atomic absorption spectrophotometry (AAS) was used for the determination of metals, and VOCs were analyzed using the FTIR (Fourier transform infrared spectroscopy) method. In this way, PM10 concentrations ranging from 13 µg/m3 to 66 µg/m3 were obtained; these values are below the national standard. This evidence indicates that the PM10 concentrations during an exposure period of 24 hours are lower than the values established by Colombian law, Resolution 610 of 2010; however, when compared with the limits set by the World Health Organization (WHO), these concentrations could possibly exceed permissible levels.

Keywords: Air quality, atomic absorption spectrophotometry, Fourier transform infrared spectroscopy, particulate matter.

2750 Detecting Cavitation in a Vertical Sea water Centrifugal Lift Pump Related to Iran Oil Industry Cooling Water Circulation System

Authors: Omid A. Zargar

Abstract:

Cavitation is one of the most well-known process faults that may occur in different industrial equipment, especially centrifugal pumps. Cavitation may also happen in water pumps and turbines. Sometimes cavitation has been severe enough to wear holes in the impeller and damage the vanes to such a degree that the impeller becomes very ineffective. More commonly, the pump efficiency will decrease significantly during cavitation and continue to decrease as damage to the impeller increases. Typically, when cavitation occurs, an audible sound similar to ‘marbles’ or ‘crackling’ is reported to be emitted from the pump. In this paper, the most effective monitoring items and techniques for detecting cavitation are discussed in detail. Besides, some successful solutions for solving this problem for a sea water vertical centrifugal lift pump are discussed through a case history related to the Iranian oil industry. Furthermore, balance line modification, strainer choking and random resonance in sea water pumps are discussed. In addition, a new method for diagnosing the mechanical condition of sea water vertical centrifugal lift pumps is introduced. This method involves disaggregating the bus current by device into disaggregated currents having correspondences with operating currents in response to the measured bus current. Moreover, some new patents and innovations in mechanical sea water pumping and cooling systems are discussed in this paper.
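
The keywords mention the time waveform and the FFT; a minimal sketch of one common monitoring item, the single-sided vibration amplitude spectrum in which cavitation tends to appear as a raised broadband noise floor, is shown below (the signal, sampling rate, and running speed are made up, and this is not the paper's full diagnostic method):

```python
import numpy as np

def vibration_spectrum(signal, fs):
    """Single-sided amplitude spectrum of a vibration time waveform.

    Cavitation typically shows up as raised broadband energy rather
    than discrete peaks, so the spectrum is inspected for a lifted
    noise floor around blade-pass and higher frequencies.
    """
    n = len(signal)
    window = np.hanning(n)
    spec = np.abs(np.fft.rfft(signal * window)) * 2.0 / np.sum(window)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spec

# Example: 1 s of a synthetic 1X running-speed tone plus broadband noise.
fs = 5000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 29.5 * t) + 0.3 * np.random.randn(t.size)
freqs, spec = vibration_spectrum(x, fs)
```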

Keywords: Cavitation, Vibration Analysis, Centrifugal Pump, Vertical Pump, Sea Water Pump, Balance Line, Strainer, Time Wave Form (TWF), Fast Fourier Transform (FFT)

2749 Optimal Design of Selective Excitation Pulses in Magnetic Resonance Imaging using Genetic Algorithms

Authors: Mohammed A. Alolfe, Abou-Bakr M. Youssef, Yasser M. Kadah

Abstract:

The proper design of RF pulses in magnetic resonance imaging (MRI) has a direct impact on the quality of acquired images and is needed for many applications. Several techniques have been proposed to obtain the RF pulse envelope given the desired slice profile. Unfortunately, these techniques do not take into account the limitations of practical implementation, such as limited amplitude resolution. Moreover, implementing constraints for special RF pulses in most techniques is not possible. In this work, we propose to develop an approach for designing optimal RF pulses under, in theory, any constraints. The new technique poses the RF pulse design problem as a combinatorial optimization problem and uses efficient techniques from this area, such as genetic algorithms (GA), to solve it. In particular, an objective function is proposed as the norm of the difference between the desired profile and the one obtained from solving the Bloch equations for the current RF pulse design values. The proposed approach is verified using RF simulations based on analytical solutions and compared with previous methods such as the Shinnar-Le Roux (SLR) method. The options and parameters that control the GA, which can significantly affect its performance, are analysed, selected, and tested in order to obtain the best results. The results show a significant improvement over conventional design techniques, identify the best GA options and parameters, and suggest the practicality of using the new technique for important applications such as slice selection for large flip angles, unconventional spatial encoding, and other clinical uses.
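
A toy sketch of the idea, assuming a small-tip-angle stand-in (the profile is approximated by the magnitude of the Fourier transform of the pulse) in place of a full Bloch-equation simulation, and with made-up GA settings, might look like this:

```python
import numpy as np

N = 64                                          # RF pulse samples
desired = np.zeros(N); desired[24:40] = 1.0     # idealized slice profile

def profile(pulse):
    # Small-tip-angle stand-in for a full Bloch simulation:
    # the excitation profile is roughly the Fourier transform of the pulse.
    p = np.abs(np.fft.fftshift(np.fft.fft(pulse)))
    return p / (p.max() + 1e-12)

def fitness(pulse):
    return -np.linalg.norm(desired - profile(pulse))    # higher is better

rng = np.random.default_rng(0)
pop = rng.normal(size=(40, N))                           # initial population
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]              # keep the best half
    cut = rng.integers(1, N, size=20)                    # one-point crossover
    children = np.array([np.r_[parents[i, :c], parents[(i + 1) % 20, c:]]
                         for i, c in enumerate(cut)])
    children += rng.normal(scale=0.05, size=children.shape)   # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
```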

Keywords: Selective excitation, magnetic resonance imaging, combinatorial optimization, pulse design.

2748 Image Spam Detection Using Color Features and K-Nearest Neighbor Classification

Authors: T. Kumaresan, S. Sanjushree, C. Palanisamy

Abstract:

Image spam is a kind of email spam where the spam text is embedded in an image. It is a new spamming technique being used by spammers to send their messages to a large number of internet users. Spam email has become a big problem in the lives of internet users, causing time consumption and economic losses. The main objective of this paper is to detect image spam by using the histogram properties of an image. Though there are many techniques to automatically detect and avoid this problem, spammers employ new tricks to bypass those techniques; as a result, those techniques are inefficient at detecting spam mails. In this paper we propose a new method to detect image spam. Here the image features are extracted by using the RGB histogram, the HSV histogram, and a combination of both RGB and HSV histograms. Based on the optimized image feature set, classification is done by using the k-Nearest Neighbor (k-NN) algorithm. Experimental results show that our method has achieved better accuracy. From the results it is known that the combination of RGB and HSV histograms with the k-NN algorithm gives the best accuracy in spam detection.
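
A minimal sketch of such a pipeline, assuming 16-bin channel histograms, Matplotlib's RGB-to-HSV conversion, and scikit-learn's k-NN classifier (real experiments would of course use labelled spam/ham email images rather than the random placeholders below):

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from sklearn.neighbors import KNeighborsClassifier

def color_histogram_features(rgb_image, bins=16):
    """Concatenated, normalised RGB and HSV channel histograms.

    rgb_image: float array in [0, 1] with shape (H, W, 3).
    """
    hsv_image = rgb_to_hsv(rgb_image)
    feats = []
    for img in (rgb_image, hsv_image):
        for ch in range(3):
            hist, _ = np.histogram(img[..., ch], bins=bins, range=(0.0, 1.0))
            feats.append(hist / hist.sum())
    return np.concatenate(feats)                 # length 6 * bins

# Toy example: random arrays standing in for email images.
rng = np.random.default_rng(1)
X = np.array([color_histogram_features(rng.random((32, 32, 3))) for _ in range(40)])
y = rng.integers(0, 2, size=40)                  # 1 = spam, 0 = ham (placeholder labels)
clf = KNeighborsClassifier(n_neighbors=3).fit(X[:30], y[:30])
print(clf.score(X[30:], y[30:]))
```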

Keywords: File Type, HSV Histogram, k-NN, RGB Histogram, Spam Detection.

2747 Evaluation Techniques of Photography in Visual Communications in Iran

Authors: Firouzeh Keshavarzi

Abstract:

Although a photograph can in itself be a graphic work, in the fields of graphics and advertising an image is usually prepared around a design idea, and the photographer helps to realize that design using his or her own knowledge and skills. It is evident that knowledge of photography offers both the photographer and the designer the means to reach a higher level of quality. A graphic designer need not also be a skilled photographer, and may delegate the execution of an idea to an expert photographer. Photographic technologies and methods may be applicable in all fields of graphic art, but in Iran they are applied mostly in works such as packaging, posters, billboards, advertising, brochures and catalogues. In this study, we review how photographic images and techniques should be used in graphic design and what impact they have had on Iranian graphic art. Photographic techniques and procedures can support a design and help advance its graphic goals, but technique alone cannot determine the idea; what matters is thinking about the design, with photography flourishing as a tool in the hands of a creative graphic designer. Computer software greatly promotes these techniques, yet it too remains only a tool for the graphic designer. Images are used in many fields, especially the graphic arts, not only for documentation but also for their beauty. Graphic works try to affect their audience, and the photograph is therefore an important factor for attracting attention. On the other hand, because people grasp images with greater tolerance and understanding than words, images allow large messages and concepts to be conveyed in the shortest time. Posters, advertisements, brochures, catalogues and the packaging of very diverse agricultural, industrial and food products can hardly do without images. Today, photographs play a major role in enriching graphic design.

Keywords: Photo, Photography Techniques, Contacts, Graphic Designer, Visual Communications, Iran.

2746 H-ARQ Techniques for Wireless Systems with Punctured Non-Binary LDPC as FEC Code

Authors: Ł. Kiedrowski, H. Gierszal, W. Hołubowicz

Abstract:

This paper presents a comparison of H-ARQ techniques for OFDM systems with a new family of non-binary LDPC codes developed within the EU FP7 DAVINCI project. The punctured NB-LDPC codes have been used in a simulated model of the transmission system. The link level performance has been evaluated in terms of spectral efficiency, codeword error rate and average number of retransmissions. The NB-LDPC codes can be easily and effectively implemented with different methods of retransmission, needed if correct decoding of a codeword fails. Here the Optimal Symbol Selection method is proposed as a Chase Combining technique.

Keywords: H-ARQ, LDPC, Non-Binary, Punctured Codes.

2745 Customer Need Type Classification Model using Data Mining Techniques for Recommender Systems

Authors: Kyoung-jae Kim

Abstract:

Recommender systems are usually regarded as an important marketing tool in e-commerce. They use important information about users to facilitate accurate recommendation. This information includes user context such as location, time and interest, for the personalization of mobile users. We can easily collect information about location and time because mobile devices communicate with the base station of the service provider. However, information about user interest cannot be easily collected because user interest cannot be captured automatically without the user's approval. User interest is usually represented as a need. In this study, we classify needs into two types according to prior research. This study investigates the usefulness of data mining techniques for classifying user need type for recommendation systems. We employ several data mining techniques including artificial neural networks, decision trees, case-based reasoning, and multivariate discriminant analysis. Experimental results show that the CHAID algorithm outperforms the other models in classifying user need type. This study performs the McNemar test to examine the statistical significance of the differences between the classification results. The results of the McNemar test also show that CHAID performs better than the other models with statistical significance.
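
For the significance check, a small sketch of McNemar's test on a made-up 2×2 table of paired classification outcomes, using statsmodels, is shown below (the counts are placeholders, not the study's data):

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 table of paired outcomes on the same test set (made-up counts):
# rows = CHAID correct / wrong, columns = competing model correct / wrong.
table = np.array([[520,  46],
                  [ 18, 116]])

result = mcnemar(table, exact=False, correction=True)  # chi-square version
print(result.statistic, result.pvalue)
```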

Keywords: Customer need type, Data mining techniques, Recommender system, Personalization, Mobile user.

2744 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application

Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil

Abstract:

In this paper we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of disease. We have applied binary regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have a total of sixteen variables, of which one is assumed dependent and the other 15 are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques like principal component analysis are applied. Based on the results of the data reduction, we have considered only 14 of the sixteen factors.
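
A minimal sketch of the data-reduction-plus-regression pipeline described, assuming scikit-learn and placeholder data in place of the hospital records (the 15 predictors and 14 retained components follow the abstract; everything else is made up):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder data standing in for the 15 clinical predictors and the
# dichotomous diagnosis (1 = acute coronary syndrome present, 0 = absent).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 15))
y = rng.integers(0, 2, size=300)

# PCA for data reduction (14 components, per the abstract),
# followed by binary logistic regression on the reduced factors.
model = make_pipeline(PCA(n_components=14), LogisticRegression(max_iter=1000))
model.fit(X[:250], y[:250])
print(model.score(X[250:], y[250:]))
```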

Keywords: Acute coronary syndrome (ACS), binary logistic regression analyses, myocardial ischemia (MI), principal component analysis, unstable angina (U.A.).

2743 Curvature Ductility Factor of Rectangular Sections Reinforced Concrete Beams

Authors: Y. Si Youcef, M. Chemrouk

Abstract:

The present work presents a method for calculating the ductility of rectangular beam sections, considering the nonlinear behavior of concrete and steel. This calculation procedure allows us to trace the curvature of the section as a function of the bending moment and consequently to deduce the ductility. It also allowed us to study the various parameters that affect the value of the ductility. A comparison was made of the effect of the maximum tension steel ratios adopted by the codes ACI [1], EC8 [2] and RPA [3] on the value of the ductility. It was concluded that the maximum steel ratios permitted by the ACI [1] and RPA [3] codes are almost similar in their effect on the ductility and are too high; therefore, the ductility mobilized in the case of an earthquake is low, unlike with code EC8 [2]. Recommendations have been made in this direction.
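
For reference, the curvature ductility factor conventionally deduced from the moment-curvature relation (presumably the quantity computed here) is:

```latex
% Standard curvature ductility factor:
%   \phi_y = curvature at first yield of the tension reinforcement
%   \phi_u = ultimate curvature of the section
\mu_{\phi} = \frac{\phi_u}{\phi_y}
```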

Keywords: Ductility, beam, reinforced concrete, seismic code, relationship, time bending, resistance, non-linear behavior.

2742 A Comparison of the Nonparametric Regression Models using Smoothing Spline and Kernel Regression

Authors: Dursun Aydin

Abstract:

This paper studies the use of nonparametric models for Gross National Product data in Turkey and the Stanford heart transplant data. Two nonparametric techniques, smoothing spline and kernel regression, are discussed. The main goal is to compare the techniques used for prediction with the nonparametric regression models. According to the results of the numerical studies, it is concluded that smoothing spline regression estimators are better than those of kernel regression.
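
A small sketch of the two estimators being compared, assuming SciPy's UnivariateSpline for the smoothing spline and a hand-written Nadaraya-Watson estimator for the kernel regression (toy data; the smoothing factor and bandwidth are arbitrary choices):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def nadaraya_watson(x_train, y_train, x_eval, bandwidth=0.5):
    """Kernel regression estimate with a Gaussian kernel."""
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Toy data standing in for the GNP / heart-transplant series.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 80))
y = np.sin(x) + 0.3 * rng.normal(size=x.size)
grid = np.linspace(0, 10, 200)

spline_fit = UnivariateSpline(x, y, s=len(x) * 0.3)(grid)   # smoothing spline
kernel_fit = nadaraya_watson(x, y, grid, bandwidth=0.6)     # kernel regression

# Prediction quality would then be compared on held-out data, e.g. via RMSE.
```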

Keywords: Kernel regression, Nonparametric models, Prediction, Smoothing spline.

2741 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from the access to vastly distributed data. Recent works have suggested that improving data locality is key to move towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers, and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: Co-scheduling, data-centric, data-intensive, data locality, in-memory storage, large scale.

2740 Optimal Image Representation for Linear Canonical Transform Multiplexing

Authors: Navdeep Goel, Salvador Gabarda

Abstract:

Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4 × 4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4 × 4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares + gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR) and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. Polynomial coefficients have been later encoded and handled for generating chirps at a target rate of about two chirps per 4 × 4 pixel block and then submitted to a transmission multiplexing operation in the time-frequency domain.
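
A minimal sketch of the first option listed, a least-squares polynomial fit of a 4 × 4 block through a Vandermonde-style design matrix, is shown below (the polynomial degree and the toy block are assumptions, and the chirp generation and multiplexing stages are not shown):

```python
import numpy as np

def fit_block(block, degree=1):
    """Least-squares 2-D polynomial fit of a 4x4 pixel block.

    Returns the coefficients of sum_{i+j<=degree} c_ij * x^i * y^j,
    i.e. fewer numbers than the 16 original pixels (3 for degree 1).
    """
    y, x = np.mgrid[0:4, 0:4]
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([(x ** i * y ** j).ravel() for i, j in terms])  # Vandermonde-style
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    return coeffs, terms

def rebuild_block(coeffs, terms):
    y, x = np.mgrid[0:4, 0:4]
    return sum(c * x ** i * y ** j for c, (i, j) in zip(coeffs, terms))

block = np.arange(16).reshape(4, 4)        # stand-in for pixel gray levels
coeffs, terms = fit_block(block, degree=1)
approx = rebuild_block(coeffs, terms)
print(np.abs(block - approx).max())        # approximation error for this block
```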

Keywords: Chirp signals, Image multiplexing, Image transformation, Linear canonical transform, Polynomial approximation.

2739 State of the Art: A Study on Fall Detection

Authors: Goh Yongli, Ooi Shih Yin, Pang Ying Han

Abstract:

Unintentional falls are rife throughout the ages and have been a common cause of serious or critical injuries, especially for the elderly. Fortunately, owing to the recent rapid advancement in technology, fall detection systems have become possible, enabling detection of falling events for the elderly, monitoring of the patient, and consequently the provision of emergency support in the event of a fall. This paper presents a review of three main categories of fall detection techniques, ranging from year 2005 to year 2010. The paper focuses on discussing these techniques, along with a summary and conclusions for them.

Keywords: State of the art, fall detection, wearable devices, ambient analyser, motion detection.

2738 A Serializability Condition for Multi-step Transactions Accessing Ordered Data

Authors: Rafat Alshorman, Walter Hussak

Abstract:

In mobile environments, unspecified numbers of transactions arrive in continuous streams. To prove correctness of their concurrent execution a method of modelling an infinite number of transactions is needed. Standard database techniques model fixed finite schedules of transactions. Lately, techniques based on temporal logic have been proposed as suitable for modelling infinite schedules. The drawback of these techniques is that proving the basic serializability correctness condition is impractical, as encoding (the absence of) conflict cyclicity within large sets of transactions results in prohibitively large temporal logic formulae. In this paper, we show that, under certain common assumptions on the graph structure of data items accessed by the transactions, conflict cyclicity need only be checked within all possible pairs of transactions. This results in formulae of considerably reduced size in any temporal-logic-based approach to proving serializability, and scales to arbitrary numbers of transactions.
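
For a fixed finite schedule, the basic serializability condition referred to here is acyclicity of the conflict (precedence) graph; a minimal sketch of that check on a made-up conflict graph follows (the paper's actual contribution, reducing the check to pairs of transactions over ordered data in infinite schedules, is not reproduced):

```python
def has_cycle(graph):
    """Detect a cycle in a directed conflict graph {node: set(successors)}."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in graph}

    def visit(n):
        colour[n] = GREY
        for m in graph.get(n, ()):
            if colour.get(m, WHITE) == GREY:        # back edge -> cycle
                return True
            if colour.get(m, WHITE) == WHITE and visit(m):
                return True
        colour[n] = BLACK
        return False

    return any(colour[n] == WHITE and visit(n) for n in graph)

# A conflict edge T_i -> T_j means some operation of T_i conflicts with,
# and precedes, an operation of T_j in the schedule (toy example).
conflicts = {"T1": {"T2"}, "T2": {"T3"}, "T3": {"T1"}}
print("serializable" if not has_cycle(conflicts) else "not serializable")
```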

Keywords: multi-step transactions, serializability, directed graph.

2737 A Design and Implementation Model for Web Caching Using Server "URL Rewriting"

Authors: Mostafa E. Saleh, A. Abdel Nabi, A. Baith Mohamed

Abstract:

In order to make surfing the internet faster, and to save the redundant processing load incurred with each request for the same web page, many caching techniques have been developed to reduce the latency of retrieving data on the World Wide Web. In this paper we give a quick overview of existing web caching techniques used for dynamic web pages, and then we introduce a design and implementation model that takes advantage of the "URL Rewriting" feature in some popular web servers, e.g. Apache, to provide an effective approach to caching dynamic web pages.

Keywords: Web Caching, URL Rewriting, Optimizing Web Performance, Dynamic Web Pages Loading Time.

2736 Resident-Aware Green Home

Authors: Ahlam Elkilani, Bayan Elsheikh Ali, Rasha Abu Romman, Amjed Al-mousa, Belal Sababha

Abstract:

The amount of energy the world uses doubles every 20 years. Green homes play an important role in reducing the residential energy demand. This paper presents a platform that is intended to learn the behavior of home residents and build a profile of their habits and actions. The proposed resident-aware home controller intervenes in the operation of home appliances in order to save energy without compromising the convenience of the residents. The presented platform can be used to simulate the actions and movements happening inside a home. The paper includes several optimization techniques that are meant to save energy in the home. In addition, several test scenarios are presented that show how the controller works. Moreover, this paper shows the computed actual savings when each of the presented techniques is implemented in a typical home. The test scenarios have validated that the techniques developed are capable of effectively saving energy in homes.

Keywords: Green Home, Resident Aware, Resident Profile, Activity Learning, Machine Learning.

2735 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been widely adopted for business purposes. For any software industry, the development of reliable software is becoming a challenging task because a faulty software module may be harmful for the growth of the industry and business. Hence there is a need to develop techniques which can be used for the early prediction of software defects. Due to the complexities of manual prediction, automated software defect prediction techniques have been introduced. These techniques are based on learning patterns from previous software versions and finding the defects in the current version. These techniques have attracted researchers due to their significant impact on industrial growth by identifying the bugs in software. Based on this, several studies have been carried out, but achieving desirable defect prediction performance is still a challenging task. To address this issue, here we present a machine learning based hybrid technique for software defect prediction. First of all, a Genetic Algorithm (GA) is presented where an improved fitness function is used for better optimization of features in data sets. Later, these features are processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented where results from the proposed GA-DT based hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
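
A rough sketch of a GA-plus-decision-tree pipeline of this general shape, assuming scikit-learn, synthetic data in place of real defect data, and a plain cross-validated-accuracy fitness rather than the paper's improved fitness function:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Placeholder defect data: rows = modules, columns = software metrics,
# y = defective (1) or clean (0).
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    """Cross-validated accuracy of a decision tree on the selected features."""
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(max_depth=5, random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((30, X.shape[1])) < 0.5          # boolean feature masks
for generation in range(25):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-15:]]       # keep the fitter half
    cut = rng.integers(1, X.shape[1], size=15)    # one-point crossover
    children = np.array([np.r_[parents[i, :c], parents[(i + 1) % 15, c:]]
                         for i, c in enumerate(cut)])
    flips = rng.random(children.shape) < 0.02     # bit-flip mutation
    children ^= flips
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print(best.sum(), "features selected")
```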

Keywords: Decision tree, genetic algorithm, machine learning, software defect prediction.

2734 Contribution to the Query Optimization in the Object-Oriented Databases

Authors: Minyar Sassi, Amel Grissa-Touzi

Abstract:

Object-oriented database management systems, which appeared around 1986, did not achieve success until some five years after their birth. One of the major difficulties is query optimization. We propose in this paper a new approach that enriches the query optimization techniques existing in object-oriented databases. Given the success that query optimization has had in the relational model, our approach draws on these optimization techniques and enriches them so that they can support the new concepts introduced by object databases.

Keywords: Query, query optimization, relational databases, object-oriented databases.

2733 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet

Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi

Abstract:

One of the most important challenging factors in medical images is noise. Image denoising refers to the improvement of a digital medical image that has been corrupted by Additive White Gaussian Noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise and AWGN. Computed tomography (CT) images are subject to low quality due to noise. The quality of CT images depends directly on the absorbed dose to patients, in such a way that an increase in absorbed radiation, and consequently in the absorbed dose to patients (ADP), enhances the CT image quality. In this context, noise reduction techniques that enhance image quality without exposing patients to excess radiation are one of the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two different directional two-dimensional (2D) transformations, i.e., Curvelet and Contourlet, and Discrete Wavelet Transform (DWT) thresholding methods, BayesShrink and AdaptShrink, compared with each other; and we propose a new threshold in the wavelet domain for not only noise reduction but also edge retention. Consequently, the proposed method retains the significant modified coefficients, resulting in good visual quality. Data evaluations were accomplished using two criteria, namely peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
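
A minimal sketch of BayesShrink-style soft thresholding in the wavelet domain, using PyWavelets (the wavelet, the number of levels, and the synthetic image are assumptions, and the paper's proposed new threshold is not reproduced):

```python
import numpy as np
import pywt

def bayes_shrink_denoise(image, wavelet="db8", levels=3):
    """Soft wavelet thresholding with BayesShrink-style thresholds.

    The noise level is estimated from the finest diagonal subband
    (median absolute deviation / 0.6745); each detail subband then
    gets the threshold sigma_n^2 / sigma_x.
    """
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for detail in coeffs[1:]:
        shrunk = []
        for band in detail:
            sigma_x = np.sqrt(max(band.var() - sigma_n ** 2, 1e-12))
            thr = sigma_n ** 2 / sigma_x
            shrunk.append(pywt.threshold(band, thr, mode="soft"))
        out.append(tuple(shrunk))
    return pywt.waverec2(out, wavelet)

# Toy example: denoise a synthetic noisy "CT slice".
img = np.zeros((128, 128)); img[32:96, 32:96] = 1.0
noisy = img + 0.1 * np.random.randn(*img.shape)
denoised = bayes_shrink_denoise(noisy)
```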

Keywords: Computed Tomography (CT), noise reduction, curvelet, contourlet, Peak Signal-to-Noise Ratio (PSNR), Structural Similarity (SSIM), Absorbed Dose to Patient (ADP).

2732 The Evaluation of Gravity Anomalies Based on Global Models by Land Gravity Data

Authors: M. Yilmaz, I. Yilmaz, M. Uysal

Abstract:

The Earth system generates different phenomena that are observable at the surface of the Earth, such as mass deformations and displacements leading to plate tectonics, earthquakes, and volcanism. The dynamic processes associated with the interior, surface, and atmosphere of the Earth affect the three pillars of geodesy: the shape of the Earth, its gravity field, and its rotation. Geodesy establishes a characteristic structure in order to define, monitor, and predict the whole Earth system. The traditional and new instruments, observables, and techniques in geodesy are related to the gravity field. Therefore, geodesy monitors the gravity field and its temporal variability in order to transform the geodetic observations made on the physical surface of the Earth into the geometrical surface on which positions are mathematically defined. In this paper, the main components of gravity field modeling, the (free-air and Bouguer) gravity anomalies, are calculated via recent global models (EGM2008, EIGEN6C4, and GECO) over a selected study area. The model-based gravity anomalies are compared with the corresponding terrestrial gravity data in terms of standard deviation (SD) and root mean square error (RMSE) for determining the best-fit global model in the study area at a regional scale in Turkey. The least SD (13.63 mGal) and RMSE (15.71 mGal) were obtained by EGM2008 for the free-air gravity anomaly residuals. For the Bouguer gravity anomaly residuals, EIGEN6C4 provides the least SD (8.05 mGal) and RMSE (8.12 mGal). The results indicated that EIGEN6C4 can be a useful tool for modeling the gravity field of the Earth over the study area.
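
The comparison statistics themselves are straightforward; a small sketch of computing the SD and RMSE of (terrestrial minus model) anomaly residuals, with made-up values rather than the study's data, follows:

```python
import numpy as np

def compare_anomalies(terrestrial, model):
    """Standard deviation and RMSE of (terrestrial - model) residuals, in mGal."""
    residuals = np.asarray(terrestrial) - np.asarray(model)
    sd = residuals.std(ddof=1)
    rmse = np.sqrt(np.mean(residuals ** 2))
    return sd, rmse

# Made-up residual comparison for two hypothetical global models.
terrestrial = np.array([12.4, -3.1, 25.0, 8.7, -14.2])
for name, model in {"model_A": np.array([10.1, -5.0, 22.3, 11.0, -12.8]),
                    "model_B": np.array([13.0, -2.5, 30.1, 4.2, -20.0])}.items():
    sd, rmse = compare_anomalies(terrestrial, model)
    print(name, round(sd, 2), round(rmse, 2))
```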

Keywords: Free-air gravity anomaly, Bouguer gravity anomaly, global model, land gravity.

2731 Contextual Enablers and Behaviour Outputs for Action of Knowledge Workers

Authors: Juan-Gabriel Cegarra-Navarro, Alexeis Garcia-Perez, Denise Bedford

Abstract:

This paper provides guidelines for what constitutes a knowledge worker. Many graduates from non-managerial domains adopt, at some point in their professional careers, management roles at different levels, ranging from team leaders through to executive leadership. This is particularly relevant for professionals from an engineering background. Moving from a technical to an executive level requires an understanding of those behaviour management techniques that can motivate and support individuals and their performance. Further, the transition to management also demands a shift of contextual enablers from tangible to intangible resources, which allows individuals to create new capacities, competencies, and capabilities. In this dynamic process, the knowledge worker becomes the key individual who can help members of the management board to transform information into relevant knowledge. However, despite its relevance in shaping the future of the organization in its transition to the knowledge economy, the role of the knowledge worker has not yet been studied to an appropriate level in the current literature. In this study, the authors review both the contextual enablers and the behaviour outputs related to the role of the knowledge worker and relate these to their ability to deal with everyday management issues such as knowledge heterogeneity, varying motivations, information overload, or outdated information. This study highlights that the aggregate of capacities, competences and capabilities (CCCs) can be defined as knowledge structures. The study proposes several contextual enablers and behaviour outputs that knowledge workers can use to work cooperatively and to acquire and distribute knowledge. Therefore, this study contributes to a better comprehension of how CCCs can be managed at different levels through their contextual enablers and behaviour outputs.

Keywords: Knowledge workers, capacities, competences, capabilities, knowledge structures.
