Search results for: inverse problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3748

2938 Greedy Geographical Void Routing for Wireless Sensor Networks

Authors: Chiang Tzu-Chiang, Chang Jia-Lin, Tsai Yue-Fu, Li Sha-Pai

Abstract:

Advances in wireless network technology have enabled a variety of mobile applications, making wireless sensor networks a popular research area in recent years. Because sensor nodes move arbitrarily and the network topology changes rapidly, mobile nodes frequently encounter the void problem, which causes packet loss, retransmission, rerouting, additional transmission cost, and extra power consumption. A void cannot be predicted in advance when a packet is transmitted, so improving geographic routing with void avoidance in wireless networks is an important issue. In this paper, we propose a greedy geographical void routing algorithm to solve the void problem in wireless sensor networks. Using the positions of the source node and the void area, we draw two tangents that form a fan range covering the void, within which void-avoiding messages are announced. We then draw a line from the source to the destination at an angle relative to the fan range to select the next forwarding neighbor. In a dynamic wireless sensor network environment, the proposed greedy void-avoiding algorithm forwards packets faster and more efficiently, improving on current geographical void handling in wireless sensor networks.
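As an illustration only (not the authors' implementation), the fan range described above can be sketched in a few lines, assuming the void area is approximated by a circle with known centre and radius and that neighbour coordinates are available; the half-angle of the fan spanned by the two tangents from the source is arcsin(r/d):

```python
import math

def fan_half_angle(source, void_center, void_radius):
    """Half-angle of the fan formed by the two tangents from the source to a circular void."""
    d = math.dist(source, void_center)
    if d <= void_radius:
        raise ValueError("source lies inside the void region")
    return math.asin(void_radius / d)

def neighbor_in_void_fan(source, neighbor, void_center, void_radius):
    """True if the direction to the neighbor falls inside the void fan seen from the source."""
    half = fan_half_angle(source, void_center, void_radius)
    to_void = math.atan2(void_center[1] - source[1], void_center[0] - source[0])
    to_nbr = math.atan2(neighbor[1] - source[1], neighbor[0] - source[0])
    diff = abs((to_nbr - to_void + math.pi) % (2 * math.pi) - math.pi)  # wrap to [0, pi]
    return diff <= half

def next_hop(source, destination, neighbors, void_center, void_radius):
    """Greedy choice: closest neighbor to the destination whose direction avoids the void fan."""
    candidates = [n for n in neighbors
                  if not neighbor_in_void_fan(source, n, void_center, void_radius)]
    return min(candidates, key=lambda n: math.dist(n, destination), default=None)
```

The authors' selection rule, based on the source-destination line drawn at an angle of the fan, would replace the simple "closest to destination outside the fan" choice used in this sketch.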

Keywords: Wireless sensor network, internet routing, wireless network, greedy void avoiding algorithm, bypassing void.

2937 Auto Classification for Search Intelligence

Authors: Lilac A. E. Al-Safadi

Abstract:

This paper proposes an auto-classification algorithm for Web pages using data mining techniques. We consider the problem of discovering association rules between terms in a set of Web pages belonging to a category in a search engine database, and present an auto-classification algorithm for solving this problem that is fundamentally based on the Apriori algorithm. The proposed technique has two phases. The first is a training phase in which human experts determine the categories of different Web pages, and a supervised data mining algorithm combines these categories with appropriately weighted index terms according to the most strongly supported rules among the most frequent words. The second is a categorization phase in which a web crawler traverses the World Wide Web to build a database categorized according to the result of the data mining approach. This database contains URLs and their categories.
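To make the Apriori-based training phase concrete, here is a minimal, hypothetical sketch of one Apriori pass over the terms of pages already labelled with a category; the tokenisation, minimum support threshold and example data are all illustrative, not the paper's exact procedure:

```python
from itertools import combinations

def frequent_terms_and_pairs(documents, min_support=0.3):
    """Toy Apriori pass over tokenised documents: frequent terms, then frequent term pairs.

    documents: list of sets of terms; min_support: fraction of documents an itemset must appear in.
    """
    n = len(documents)
    # Level 1: frequent single terms
    term_counts = {}
    for doc in documents:
        for term in doc:
            term_counts[term] = term_counts.get(term, 0) + 1
    frequent = {t for t, c in term_counts.items() if c / n >= min_support}

    # Level 2: candidate pairs built only from frequent terms (Apriori pruning)
    pair_counts = {}
    for doc in documents:
        for pair in combinations(sorted(doc & frequent), 2):
            pair_counts[pair] = pair_counts.get(pair, 0) + 1
    frequent_pairs = {p: c / n for p, c in pair_counts.items() if c / n >= min_support}
    return frequent, frequent_pairs

# Example: pages already labelled with one category by the experts
docs = [{"camera", "lens", "zoom"}, {"camera", "battery", "zoom"}, {"camera", "lens"}]
print(frequent_terms_and_pairs(docs, min_support=0.6))
```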

Keywords: Information Processing on the Web, Data Mining, Document Classification.

2936 A Case Study in Using the Can-Sized Satellite Platforms for Interdisciplinary Problem-Based Learning in Aeronautical and Electronic Engineering

Authors: Michael Johnson, Vincenzo Oliveri

Abstract:

This work considers an interdisciplinary Problem-Based Learning (PBL) project developed by lecturers from the Aeronautical and Electronic and Computer Engineering departments at the University of Limerick. This “CANSAT” project uses the CanSat can-sized satellite platform to allow students from aeronautical and electronic engineering to engage in a mixed-format (online/face-to-face), interdisciplinary PBL assignment built around a real-world platform and application. The project introduces students to the design, development, and construction of the CanSat system over the course of a single semester, enabling students to apply their aeronautical and technical skills to the realisation of a working CanSat system. In this case study, the CanSat kits are used to pivot the real-world, discipline-relevant PBL goal of designing, building, and testing the CanSat system with payload(s) from a traditional module-based setting to an online PBL setting. Feedback, impressions, benefits, and challenges identified through the semester are presented. Students found the project interesting and rewarding, and its interdisciplinary nature appealed to them. Challenges and difficulties encountered are also addressed, along with the solutions developed between students and facilitators to overcome them.

Keywords: Problem-Based Learning, Online PBL, Electronic Engineering, Aeronautical Engineering, Interdisciplinary Project, CanSat.

2935 A Novel Approach of Route Choice in Stochastic Time-varying Networks

Authors: Siliang Wang, Minghui Wang

Abstract:

Many existing studies use Markov decision processes (MDPs) to model optimal route choice in stochastic, time-varying networks. However, transforming large volumes of variable traffic data into optimal route decisions with MDPs is computationally challenging in real transportation networks. In this paper, we model finite-horizon MDPs using directed hypergraphs. It is shown that the route choice problem in stochastic, time-varying networks can be formulated as a minimum cost hyperpath problem and solved in linear time. We finally demonstrate the significant computational advantages of the introduced methods.

Keywords: Markov decision processes (MDPs), stochastic time-varying networks, hypergraphs, route choice.

2934 Application of Ant Colony Optimization for Multi-objective Production Problems

Authors: Teerapun Saeheaw, Nivit Charoenchai, Wichai Chattinnawat

Abstract:

This paper proposes a meta-heuristic, Ant Colony Optimization, to solve multi-objective production problems. The multi-objective function minimizes lead time and work in process. The problem involves two decision variables, i.e., distance and process time. The mathematical model is formulated according to the decision criteria, and an ant colony optimization approach is developed to solve it. The proposed algorithm is parameterized by the number of ant colonies and the number of pheromone trails. One example is given to illustrate the effectiveness of the proposed model. The proposed formulation, a Max-Min Ant System, is then used to solve the problem, and simulation results are used to evaluate the performance and efficiency of the proposed algorithm.
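As a hedged illustration of the Max-Min Ant System component, the following sketch shows a single pheromone update with evaporation, best-tour reinforcement and the characteristic pheromone bounds; the tour, cost and parameter values are made up, and this is not the authors' production model:

```python
import numpy as np

def mmas_pheromone_update(tau, best_tour, best_cost, rho=0.1, tau_min=0.01, tau_max=5.0):
    """One Max-Min Ant System update: evaporate, reinforce the best tour, clamp to [tau_min, tau_max]."""
    tau = (1.0 - rho) * tau                      # evaporation on every edge
    deposit = 1.0 / best_cost                    # only the best ant deposits pheromone
    for i, j in zip(best_tour, best_tour[1:]):
        tau[i, j] += deposit
        tau[j, i] += deposit
    return np.clip(tau, tau_min, tau_max)        # Max-Min bounds help avoid premature convergence

# Hypothetical example with 4 nodes and a best tour 0-2-1-3
tau = np.full((4, 4), 1.0)
tau = mmas_pheromone_update(tau, best_tour=[0, 2, 1, 3], best_cost=42.0)
```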

Keywords: Ant colony optimization, multi-objective problems.

2933 Film Sensors for the Harsh Environment Application

Authors: Wenmin Qu

Abstract:

A capacitance level sensor with a segmented film electrode and a thin-film volume flow sensor with an innovative by-pass sleeve are presented as industrial products for application in harsh environments. The working principles of such sensors are well known; however, traditional sensors show some limitations for certain industrial measurements. The two sensors presented in this paper overcome these limitations and enlarge the application spectrum. The problem is analyzed and the solution given, with emphasis on developing the problem-solving concepts and realizing the corresponding measuring circuits. These should offer advice and encouragement on how electronic measuring products can still be developed in an almost saturated market.

Keywords: By-pass sleeve, charge transfer circuit, fixed ΔT circuit, harsh environment, industrial application, segmented electrode.

2932 Marital Interactions in Predicting Treatment Outcome in Panic Disorder with Agoraphobia

Authors: Ghassan El-Baalbaki, Claude Bélanger, Michel Perreault, Steffany J. Fredman, Donald H. Baucom

Abstract:

This study had two goals. First, it investigated marital interaction variables as predictors of treatment outcome in panic disorder with agoraphobia (PDA) in sixty-five couples with one spouse suffering from PDA. Second, it analyzed the impact of PDA improvement following therapy on the marital interaction patterns of both spouses. The partners were observed during a problem-solving task before and after treatment. Negative behaviors at the outset of therapy, in both the PDA and the non-PDA (NPDA) partners, predicted less improvement at post-test. It also appears that improvement in some PDA symptoms following therapy is linked to an increase in the dominant behavior of the NPDA spouse and to an improvement in terms of their intrusiveness.

Keywords: Communication and problem-solving skills, Emotional overinvolvement, Marital relationship, Panic disorder with agoraphobia, Treatment outcome.

2931 Development of an Immunoassay Platform for Diagnosis of Acute Kidney Injury

Authors: T. Bovornvirakit, K. Viravaidya

Abstract:

Acute kidney injury (AKI) is an emerging worldwide public health problem. Diagnosing this disease using creatinine remains problematic in clinical practice, so the measurement of biomarkers responsible for AKI has received much attention in the past couple of years. The cytokine interleukin-18 (IL-18) has been reported as one of the early biomarkers for AKI, and the most commonly used method to detect this biomarker is an immunoassay. This study used a planar platform to perform an immunoassay with fluorescence detection. Anti-IL-18 antibody was immobilized onto a microscope slide using a covalent binding method. Make-up samples were diluted to concentrations between 10 and 1000 pg/ml to create a calibration curve. The precision of the system was determined using the coefficient of variability (CV), which was found to be less than 10%. The performance of this immunoassay system was compared with measurements from ELISA.

Keywords: Acute kidney injury, Acute renal failure, Antibody immobilization, Interleukin-18

2930 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation

Authors: C. Bunsanit

Abstract:

This paper presents a refinement method for forming two beams with a wideband smart antenna. The refinement of the weighting coefficients is based on fully spatial signal processing using the Inverse Discrete Fourier Transform (IDFT), and simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal with real weights and summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of weight values is relatively wide, and the refinement method is used to reduce it. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the direction of the main beam, the beamwidth, and the maximum minor lobe level. A comparison of the simulation results obtained with the refinement method and with the IDFT alone shows that the refinement method works well for wideband two-beam formation.

Keywords: Fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband.

2929 A Research on Inference from Multiple Distance Variables in Hedonic Regression – Focus on Three Variables

Authors: Yan Wang, Yasushi Asami, Yukio Sadahiro

Abstract:

In an urban context, urban nodes such as amenities or hazards affect house prices, and classic hedonic analysis employs distance variables measured from each urban node. However, the effects of distances to facilities on house prices generally do not represent the true price contribution of each node. Distance variables measured over the same surface suffer from multicollinearity, which typically manifests in regression as inflated variance and unstable coefficient estimates. In this paper, we provide a theoretical framework for identifying and gathering data with less bias, and we propose a specific sampling method for locating the sample region so as to avoid the spatial multicollinearity problem in the three-distance-variable case.
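For readers who want to reproduce the diagnosis of collinearity among distance variables, a minimal sketch using variance inflation factors is given below; it is a standard check, not the sampling method proposed in the paper, and the data are synthetic:

```python
import numpy as np

def variance_inflation_factors(X):
    """VIF for each column of X: regress it on the remaining columns; VIF = 1 / (1 - R^2)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    vifs = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])      # intercept + other distance variables
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        vifs.append(1.0 / max(1.0 - r2, 1e-12))
    return vifs

# Synthetic example: three distance variables measured for the same properties (columns of X)
X = np.random.default_rng(0).normal(size=(200, 3))
X[:, 2] = 0.7 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * X[:, 2]   # deliberately collinear third variable
print(variance_inflation_factors(X))
```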

Keywords: Hedonic regression, urban node, distance variables, multicollinearity, collinearity.

2928 A New OvS Approach in an Assembly Line Balancing Problem

Authors: P. Azimi, B. Behtoiy

Abstract:

One of the most well-known techniques affecting the efficiency of a production line is assembly line balancing (ALB). This paper examines the balancing of a whole production line of a real auto glass manufacturer in three steps. In the first step, the processing time of each activity in the workstations is generated according to a practical approach. In the second step, the whole production process is simulated and the bottleneck stations are identified. Finally, in the third step, several improvement scenarios are generated to optimize the system throughput, and the best one is proposed. The main contribution of the current research is the proposed framework, which combines two well-known approaches: Assembly Line Balancing and Optimization via Simulation (OvS). The results show that the proposed framework can easily be applied in practical environments.

Keywords: Assembly line balancing problem, optimization via simulation, production planning.

2927 Comparative Performance of Artificial Bee Colony Based Algorithms for Wind-Thermal Unit Commitment

Authors: P. K. Singhal, R. Naresh, V. Sharma

Abstract:

This paper presents three optimization models, namely the New Binary Artificial Bee Colony (NBABC) algorithm, NBABC with Local Search (NBABC-LS), and NBABC with Genetic Crossover (NBABC-GC), for solving the Wind-Thermal Unit Commitment (WTUC) problem. The uncertain nature of wind power is incorporated using the Weibull probability density function, which is used to calculate the overestimation and underestimation costs associated with wind power fluctuation. The NBABC algorithm utilizes a mechanism based on the dissimilarity measure between binary strings for generating binary solutions in the WTUC problem. In the NBABC algorithm, an intelligent scout bee phase is proposed that replaces an abandoned solution with the global best solution. The local search operator exploits the neighboring region of the current solutions, whereas the integration of genetic crossover with the NBABC algorithm increases the diversity in the search space and thus avoids the local trapping encountered with the NBABC algorithm. These models are used to decide the on/off status of the units, whereas the lambda iteration method is used to dispatch the hourly load demand among the committed units. The effectiveness of the proposed models is validated on an IEEE 10-unit thermal system combined with a wind farm over a planning period of 24 hours.
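The lambda iteration dispatch step mentioned above can be sketched as follows for units with assumed quadratic fuel costs; the coefficients, limits, demand and bisection bracket are all hypothetical, and the sketch ignores the wind cost terms:

```python
import numpy as np

def lambda_iteration(a, b, demand, pmin, pmax, tol=1e-6, max_iter=200):
    """Dispatch committed units with cost C_i(P) = a_i*P^2 + b_i*P via bisection on lambda.

    At the optimum dC_i/dP = 2*a_i*P_i + b_i = lambda, so P_i(lambda) = (lambda - b_i) / (2*a_i),
    clipped to the unit limits; lambda is adjusted until total output meets the demand.
    """
    lo, hi = 0.0, 1e4                      # assumed bracket for the incremental cost
    for _ in range(max_iter):
        lam = 0.5 * (lo + hi)
        p = np.clip((lam - b) / (2.0 * a), pmin, pmax)
        mismatch = p.sum() - demand
        if abs(mismatch) < tol:
            break
        if mismatch > 0:
            hi = lam
        else:
            lo = lam
    return p, lam

# Hypothetical 3-unit example (cost coefficients a, b; limits and demand in MW)
a = np.array([0.008, 0.010, 0.012])
b = np.array([7.0, 8.0, 9.0])
p, lam = lambda_iteration(a, b, demand=500.0,
                          pmin=np.array([50.0, 50.0, 50.0]),
                          pmax=np.array([300.0, 250.0, 200.0]))
```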

Keywords: Artificial bee colony algorithm, economic dispatch, unit commitment, wind power.

2925 Analysis of Physicochemical Properties on Prediction of R5, X4 and R5X4 HIV-1 Coreceptor Usage

Authors: Kai-Ti Hsu, Hui-Ling Huang, Chun-Wei Tung, Yi-Hsiung Chen, Shinn-Ying Ho

Abstract:

Bioinformatics methods for predicting T-cell coreceptor usage from the HIV-1 membrane protein sequence are investigated. In this study, we aim to propose an effective prediction method for the three-class classification problem of CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4) usage. We investigate the coreceptor prediction problem as follows: 1) we propose a feature set of informative physicochemical properties which, combined with an SVM, achieves a high prediction test accuracy of 81.48%, compared with 70.00% for the existing method; 2) we establish a large, up-to-date data set by increasing its size from 159 to 1225 sequences to verify the proposed prediction method, obtaining a mean test accuracy of 88.59%; and 3) we analyze the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.

Keywords: Coreceptor, genetic algorithm, HIV-1, SVM, physicochemical properties, prediction.

2924 Interactive Fuzzy Multi-objective Programming in Land Re-organisational Planning for Sustainable Rural Development

Authors: Bijaya Krushna Mangaraj, Deepak Kumar Das

Abstract:

Sustainability in a rural production system can only be achieved if the system suitably satisfies both local requirements and outside demand as they change over time. With increased pressure from the food sector in a globalised world, the agrarian economy needs to re-organise its cultivable land system to be compatible with new management practices, the multiple needs of various stakeholders, and the changing resource scenario. An attempt has been made to transform this problem into a multi-objective decision-making problem considering various objectives, resource constraints and conditional constraints. An interactive fuzzy multi-objective programming approach has been used for this purpose, with a case study in the Indian context demonstrating the validity of the method.

Keywords: Land re-organisation, Crop planning, Multiobjective Decision-Making, Fuzzy Goal Programming.

2923 Buckling of Plates on Foundation with Different Types of Sides Support

Authors: Ali N. Suri, Ahmad A. Al-Makhlufi

Abstract:

In this paper, the buckling of plates of finite length resting on a foundation, with different types of side support, is studied.

The Finite Strip Method is used as the analysis tool. This method uses finite strip elastic, foundation, and geometric matrices to build the assembly matrices for the whole structure; after introducing the boundary conditions at the supports, the resulting reduced matrices are transformed into a standard eigenvalue-eigenvector problem. The solution of this problem yields the buckling load, the associated buckling modes and the buckling wavelength.
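As a minimal illustration of this final step, the reduced generalized eigenvalue problem K x = λ Kg x can be solved as below; the 3x3 matrices are made-up stand-ins, not the finite strip matrices of the paper:

```python
import numpy as np
from scipy.linalg import eigh

def critical_buckling_load(K, Kg):
    """Smallest positive eigenvalue of K x = lambda Kg x, i.e. the critical load factor.

    K  : assembled elastic + foundation stiffness matrix (symmetric)
    Kg : assembled geometric stiffness matrix (symmetric positive definite in this sketch)
    """
    eigvals, eigvecs = eigh(K, Kg)                 # generalized symmetric eigenproblem
    lam_cr = eigvals[eigvals > 0].min()
    mode = eigvecs[:, np.argmin(np.abs(eigvals - lam_cr))]
    return lam_cr, mode

# Tiny made-up 3-DOF system standing in for the reduced finite strip matrices
K = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
Kg = np.eye(3)
lam_cr, mode = critical_buckling_load(K, Kg)
```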

To carry out the buckling analysis, starting from the elastic, foundation, and geometric stiffness matrices for each strip, a FORTRAN computer program is developed.

Since the stiffness matrices are functions of the buckling wavelength, the program uses an iteration procedure to find the critical buckling stress for each value of the foundation modulus and for each boundary condition.

The results show that using an elastic medium to support plates subject to axial load greatly increases the buckling load; the results obtained are very close to those of other analytical methods and experimental work.

The results also show that the foundation compensates for the weakness of some types of side support constraint, with the maximum benefit found for a plate with one side simply supported and the other free.

Keywords: Buckling, Finite Strip, Different Sides Support, Plates on Foundation.

2922 Visual Hull with Imprecise Input

Authors: Peng He

Abstract:

Imprecision is a long-standing problem in CAD design and in high-accuracy image-based reconstruction applications. The visual hull, the closed shape equivalent to the silhouettes of the objects of interest, is an important concept in image-based reconstruction. We extend the domain-theoretic framework, a robust geometric model that captures imprecision, to analyze the imprecision in the output shape when the input vertices are given with imprecision. Under this framework, we give an efficient algorithm to generate the 2D partial visual hull, which represents the exact information of the visual hull under only basic imprecision assumptions. We also show how the visual hull from polyhedra problem can be efficiently solved in the context of imprecise input.

Keywords: Geometric Domain, Computer Vision, Computational Geometry, Visual Hull, Image-Based reconstruction, Imprecise Input, CAD object

2921 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error

Authors: Qianhua He, Weili Zhou, Aiwu Chen

Abstract:

A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. First, the cross-correlation between the clean speech spectrum and the noise spectrum is analyzed and an estimation method is proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum is learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error is adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach is applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech is re-synthesised via the inverse Fourier transform using the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms conventional methods in terms of both subjective and objective measures.
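A minimal sketch of orthogonal matching pursuit with a residual-energy stopping criterion is given below; the dictionary, signal and stopping value are synthetic, and the adaptive estimation of the stopping residue error described in the paper is not reproduced here:

```python
import numpy as np

def omp(D, x, stop_residue, max_atoms=None):
    """Orthogonal matching pursuit with a residual-energy stopping criterion.

    D: dictionary with unit-norm columns (atoms); x: signal vector;
    stop_residue: residual energy at which the pursuit stops.
    """
    max_atoms = max_atoms or D.shape[1]
    residual = x.copy()
    support, sol = [], np.array([])
    coeffs = np.zeros(D.shape[1])
    while residual @ residual > stop_residue and len(support) < max_atoms:
        k = int(np.argmax(np.abs(D.T @ residual)))   # atom most correlated with the residual
        if k in support:
            break
        support.append(k)
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)  # re-fit over the whole support
        residual = x - D[:, support] @ sol
    if support:
        coeffs[support] = sol
    return coeffs, residual

# Synthetic usage: a 2-sparse signal plus noise, reconstructed from a random dictionary
rng = np.random.default_rng(1)
D = rng.normal(size=(64, 128)); D /= np.linalg.norm(D, axis=0)
x = D[:, [3, 40]] @ np.array([1.5, -0.8]) + 0.01 * rng.normal(size=64)
coeffs, residual = omp(D, x, stop_residue=0.01)
```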

Keywords: Speech denoising, sparse representation, K-singular value decomposition, orthogonal matching pursuit.

2920 Choosing Search Algorithms in Bayesian Optimization Algorithm

Authors: Hao Wu, Jonathan L. Shapiro

Abstract:

The Bayesian Optimization Algorithm (BOA) is an estimation-of-distribution algorithm. It uses techniques for modeling data with Bayesian networks to estimate the joint distribution of promising solutions. Different search algorithms can be used to obtain the structure of the Bayesian network. The key point BOA addresses is whether the constructed Bayesian network can generate new and useful solutions (strings) that lead the algorithm in the right direction to solve the problem; this ability is undoubtedly a crucial factor in the efficiency of BOA. Various search algorithms can be used in BOA, but their performances differ, and a suitable way of quantifying this difference is needed to choose between them. In this paper, a greedy search algorithm and a stochastic search algorithm are used in BOA to solve a given optimization problem, and a method using the Kullback-Leibler (KL) divergence to reflect their difference is described.
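For concreteness, the KL divergence between the empirical distributions of promising solutions produced by two search variants could be computed as in the following sketch; the two distributions shown are hypothetical and the zero-mass smoothing is an illustrative choice:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(P || Q) over a shared discrete support (here: binary strings), with smoothing for zero mass."""
    support = set(p) | set(q)
    return sum(p.get(s, 0.0) * math.log((p.get(s, 0.0) + eps) / (q.get(s, eps) + eps))
               for s in support if p.get(s, 0.0) > 0.0)

# Hypothetical empirical distributions of promising strings found by two BOA variants
p_greedy = {"0110": 0.5, "0111": 0.3, "1111": 0.2}
q_stochastic = {"0110": 0.4, "0111": 0.4, "1010": 0.2}
print(kl_divergence(p_greedy, q_stochastic))
```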

Keywords: Bayesian optimization algorithm, greedy search, KL divergence, stochastic search.

2919 Product Features Extraction from Opinions According to Time

Authors: Kamal Amarouche, Houda Benbrahim, Ismail Kassou

Abstract:

Nowadays, e-commerce shopping websites have experienced noticeable growth and have gained consumers’ trust. After purchasing a product, many consumers share comments in which opinions about the product are usually embedded. Research on the automatic management of opinions, which gives suggestions to potential consumers and portrays an image of the product to manufacturers, has been growing recently. Just after a product is launched in the market, the reviews generated around it usually contain only generic opinions rather than helpful information (e.g. telephone: great phone...), since the product is still in its launching phase. Over time, the product matures and consumers perceive the advantages and disadvantages of each specific product feature, so they generate comments that contain their sentiments about these features. In this paper, we present an unsupervised method to extract the different product features hidden in the opinions that influence a purchase; it combines Time Weighting (TW), which depends on the time the opinions were expressed, with Term Frequency-Inverse Document Frequency (TF-IDF). We conduct several experiments using two different datasets about cell phones and hotels. The results show the effectiveness of our automatic feature extraction, as well as its domain-independent character.
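A hedged sketch of how a time weight might be combined with TF-IDF is shown below; the exponential decay form, half-life and tokenisation are illustrative assumptions, not the paper's exact TW definition:

```python
import math
from collections import Counter

def time_weighted_tfidf(reviews, now, half_life_days=180.0):
    """Score each term by TF-IDF multiplied by an exponential time weight.

    reviews: list of (timestamp_in_days, list_of_tokens); now: current time in days.
    The half-life and the TW form are illustrative; the paper's TW definition may differ.
    """
    n_docs = len(reviews)
    df = Counter()
    for _, tokens in reviews:
        df.update(set(tokens))            # document frequency of each term

    scores = Counter()
    for t_review, tokens in reviews:
        tw = 0.5 ** ((now - t_review) / half_life_days)      # newer reviews weigh more
        tf = Counter(tokens)
        for term, count in tf.items():
            idf = math.log(n_docs / df[term])
            scores[term] += tw * (count / len(tokens)) * idf
    return scores

reviews = [(0, ["battery", "great", "phone"]), (300, ["battery", "drains", "fast"])]
print(time_weighted_tfidf(reviews, now=365).most_common(3))
```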

Keywords: Opinion mining, product feature extraction, sentiment analysis, SentiWordNet.

2918 Simulation Modeling of Fire Station Locations under Traffic Obstacles

Authors: Mehmet Savsar

Abstract:

The facility location problem involves locating a facility so as to optimize some performance measure. The location of a public facility that serves the community, such as a fire station, significantly affects its service quality. The main objective in locating a fire station is to minimize the response time, which is the time between receiving a call and reaching the place of the incident. In metropolitan areas, fire vehicles need to cross highways and other traffic obstacles through obstacle-overcoming points, which delays the response time. In this paper, the fire station location problem is analyzed. Simulation models are developed for location problems that involve obstacles, and particular case problems are analyzed and the results presented.

Keywords: Public Facility Location, Fire Stations, Response Time, Fire Vehicle Delays.

2917 Modeling and Simulation for 3D Eddy Current Testing in Conducting Materials

Authors: S. Bennoud, M. Zergoug

Abstract:

The numerical simulation of electromagnetic interactions is still a challenging problem, especially for problems that lead to fully three-dimensional mathematical models.

The goal of this work is to use mathematical modeling to characterize the reliability and capacity of the eddy current technique to detect and characterize defects embedded in in-service aeronautical parts.

The finite element method is used to describe the eddy current technique in a mathematical model by predicting the eddy current interaction with defects. However, this model is an approximation of the full Maxwell equations.

In this study, the analysis of the problem is based on a three dimensional finite element model that computes directly the electromagnetic field distortions due to defects.

Keywords: Eddy current, Finite element method, Non destructive testing, Numerical simulations.

2916 Relation between Roots and Tangent Lines of Function in Fractional Dimensions: A Method for Optimization Problems

Authors: Ali Dorostkar

Abstract:

In this paper, a basic schematic of the fractional-dimensional optimization problem is presented. As will be shown, a method is developed based on a relation between the roots and the tangent lines of a function in fractional dimensions, starting from an arbitrary initial point. It is shown that for each polynomial function of order N, at least N tangent lines must exist in fractional dimensions 0 < α < N+1 that pass exactly through all the roots of the proposed function. A geometrical analysis of tangent lines in fractional dimensions is also presented to clarify the proposed method more intuitively. Results show that, with an appropriate selection of fractional dimensions, the roots can be found directly. The method offers a different direction for optimization problems through the use of fractional dimensions.

Keywords: Tangent line, fractional dimension, root, optimization problem.

2915 A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme that combines singular value decomposition (SVD) and the discrete cosine transform (DCT) is proposed. In the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks, and SVD is applied to each block. By concatenating the first singular values (SVs) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients, and an adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction essentially follows the inverse process and is blind and efficient. Experimental results show that the quality degradation caused by the embedded watermark is visually transparent, and that the proposed scheme is robust against various image processing operations and geometric attacks.
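The coefficient-relationship embedding step can be illustrated with the following sketch, which operates on the DCT of the block-wise first singular values; the block size, coefficient pair, embedding margin and the omission of normalization, the adaptive frequency mask and the write-back of modified singular values are all simplifications of the scheme described above:

```python
import numpy as np
from scipy.fft import dct, idct

def embed_bit(image, bit, block=8, idx=(3, 5), margin=2.0):
    """Embed one watermark bit in the DCT of the block-wise first singular values.

    The bit is encoded by forcing c[idx[0]] > c[idx[1]] (bit 1) or the reverse (bit 0);
    the block size, coefficient pair and margin are illustrative choices.
    """
    h, w = image.shape
    blocks = [image[i:i + block, j:j + block]
              for i in range(0, h - block + 1, block)
              for j in range(0, w - block + 1, block)]
    sv = np.array([np.linalg.svd(b, compute_uv=False)[0] for b in blocks])  # first SV per block
    c = dct(sv, norm='ortho')
    i0, i1 = idx
    if bit == 1 and c[i0] <= c[i1]:
        c[i0], c[i1] = c[i1] + margin, c[i0]
    elif bit == 0 and c[i0] >= c[i1]:
        c[i0], c[i1] = c[i1], c[i0] + margin
    return idct(c, norm='ortho')   # modified SV sequence (to be written back into the blocks)

def extract_bit(sv_sequence, idx=(3, 5)):
    """Blind extraction: recover the bit from the order of the two selected DCT coefficients."""
    c = dct(sv_sequence, norm='ortho')
    return 1 if c[idx[0]] > c[idx[1]] else 0

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(64, 64))
print(extract_bit(embed_bit(img, bit=1)))
```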

Keywords: Image watermarking, Image normalization, Singular value decomposition, Discrete cosine transform, Robustness.

2914 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection in Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data enter the fusion. The first consists of features extracted from text using the bag-of-words method and weighted by term frequency-inverse document frequency (TF-IDF). The second consists of visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied to fuse the information from these two sources. Our experiments indicate that, compared with approaches using an individual data source, the proposed data fusion approach increases the prediction accuracy for event detection: the proposed method achieved a high accuracy of 0.97, compared with 0.93 using texts only and 0.86 using images only.
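Dempster's rule of combination for the two sources can be sketched as follows; the mass assignments from the text and image classifiers are hypothetical:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same frame.

    Masses are dicts mapping frozenset hypotheses to belief mass; conflicting mass
    (empty intersections) is discarded and the remainder renormalised.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Hypothetical masses from the text classifier and the image classifier for an event E
event, no_event = frozenset({"E"}), frozenset({"notE"})
both = event | no_event                      # the full frame (ignorance)
m_text = {event: 0.7, no_event: 0.1, both: 0.2}
m_image = {event: 0.6, no_event: 0.2, both: 0.2}
print(dempster_combine(m_text, m_image))
```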

Keywords: Data fusion, Dempster-Shafer theory, data mining, event detection.

2913 Solving an Extended Resource Leveling Problem with Multiobjective Evolutionary Algorithms

Authors: Javier Roca, Etienne Pugnaghi, Gaëtan Libert

Abstract:

We introduce an extended resource leveling model that abstracts real-life projects in which each resource has a specific work range. Contrary to traditional resource leveling problems, this model considers scarce resources and multiple objectives: the minimization of the project makespan and the leveling of each resource usage over time. We formulate this model as a multiobjective optimization problem and propose a multiobjective genetic algorithm-based solver to optimize it. This solver consists of a two-stage process: a main stage where we obtain non-dominated solutions for all the objectives, and a post-processing stage where we seek to specifically improve the resource leveling of these solutions. We propose an intelligent encoding for the solver that allows domain-specific knowledge to be included in the solving mechanism; the chosen encoding proves effective for leveling problems with scarce resources and multiple objectives. The outcomes of the proposed solver represent optimized trade-offs (alternatives) that can later be evaluated by a decision maker, and this multi-solution approach is an advantage over the traditional single-solution approach. We compare the proposed solver with state-of-the-art resource leveling methods and report competitive results.

Keywords: Intelligent problem encoding, multiobjective decision making, evolutionary computing, RCPSP, resource leveling.

2912 A Soft Set based Group Decision Making Method with Criteria Weight

Authors: Samsiah Abdul Razak, Daud Mohamad

Abstract:

Molodtsov's soft set theory was originally proposed as a general mathematical tool for dealing with uncertainty problems. The matrix form of soft sets has been introduced and some of its properties have been discussed. However, existing formulations of soft matrices in group decision making assign equal importance weights to all criteria, which does not reflect the decision maker's true opinion of each criterion. The aim of this paper is to propose a method for solving group decision making problems that incorporates the importance of the criteria by using soft matrices in a more objective manner. The weight of each criterion is calculated using the Analytic Hierarchy Process (AHP) method. An example of a house selection process is given to illustrate the effectiveness of the proposed method.
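The AHP weighting step can be illustrated with a short sketch that derives criteria weights from a pairwise comparison matrix via its principal eigenvector; the comparison values are hypothetical and the consistency check is omitted:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise comparison matrix via its principal eigenvector."""
    pairwise = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    weights = principal / principal.sum()      # normalise so the weights sum to one
    # A consistency ratio check is omitted here; in practice CR < 0.1 is usually required.
    return weights

# Hypothetical 3-criterion comparison (e.g. price vs. location vs. size) on Saaty's 1-9 scale
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
print(ahp_weights(A))
```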

Keywords: Soft set, Soft Matrix, Soft max-min decision making (SMmDM), Analytic hierarchy process (AHP)

2911 Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding

Authors: R. Krishnamoorthi, N. Kannan

Abstract:

In this paper, a new algorithm for generating a codebook for vector quantization (VQ) in image coding is proposed. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to the inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to recover approximations of the input training image vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm achieves a considerable reduction in computation time and provides better reconstructed picture quality.
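A minimal sketch of the binary tree codebook idea (split each non-terminal node on a single feature against a threshold, store codewords at the leaves) is given below; the splitting rule and data are illustrative and do not use the paper's Orthogonal Polynomials features:

```python
import numpy as np

def build_tree(vectors, max_depth=4, min_size=8):
    """Grow a binary tree codebook: at each node, split on one feature against a threshold.

    Leaves store the centroid (codeword); internal nodes store (feature index, threshold).
    The splitting rule (highest-variance feature at its median) is an illustrative choice.
    """
    if max_depth == 0 or len(vectors) <= min_size:
        return {"codeword": vectors.mean(axis=0)}
    feature = int(np.argmax(vectors.var(axis=0)))
    threshold = float(np.median(vectors[:, feature]))
    left_mask = vectors[:, feature] <= threshold
    if left_mask.all() or not left_mask.any():          # degenerate split: stop here
        return {"codeword": vectors.mean(axis=0)}
    return {"feature": feature, "threshold": threshold,
            "left": build_tree(vectors[left_mask], max_depth - 1, min_size),
            "right": build_tree(vectors[~left_mask], max_depth - 1, min_size)}

def encode(vector, node, path=""):
    """Descend the tree by comparing the node's single feature to its threshold; return bit path and codeword."""
    if "codeword" in node:
        return path, node["codeword"]
    branch = "0" if vector[node["feature"]] <= node["threshold"] else "1"
    child = node["left"] if branch == "0" else node["right"]
    return encode(vector, child, path + branch)

training = np.random.default_rng(2).normal(size=(256, 16))   # stand-in for transform feature vectors
tree = build_tree(training)
bits, codeword = encode(training[0], tree)
```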

Keywords: Orthogonal Polynomials, Image Coding, Vector Quantization, TSVQ, Binary Tree Classifier

2910 Model Discovery and Validation for the QSAR Problem Using Association Rule Mining

Authors: Luminita Dumitriu, Cristina Segal, Marian Craciun, Adina Cocu, Lucian P. Georgescu

Abstract:

There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem, based either on statistical methods or on predictive data mining. Among the statistical methods, one should consider regression analysis, pattern recognition (such as cluster analysis, factor analysis and principal components analysis) and partial least squares. Predictive data mining techniques use neural networks, genetic programming, or neuro-fuzzy knowledge. These approaches have low explanatory capability, or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining. In this way, the relationship between the chemical properties and the activity of a substance is modeled comprehensibly.

Keywords: association rules, classification, data mining, Quantitative Structure - Activity Relationship.

2909 Numerical Investigation of Non Fourier Heat Conduction in a Semi-infinite Body due to a Moving Concentrated Heat Source Composed with Radiational Boundary Condition

Authors: M. Akbari, S. Sadodin

Abstract:

In this paper, the melting of a semi-infinite body as a result of a moving laser beam has been studied. Because the Fourier heat transfer equation does not have sufficient accuracy at short times and large dimensions, a non-Fourier form of the heat transfer equation has been used. Because the beam moves in the x direction, the temperature distribution and the melt pool shape are not symmetric, so the problem is a transient three-dimensional one. Moreover, thermophysical properties such as the heat conductivity coefficient, density and heat capacity are functions of temperature and material state. The enthalpy technique, used for the solution of phase change problems, has been applied in an explicit finite volume form to the hyperbolic heat transfer equation. This technique has been used to calculate the transient temperature distribution in the semi-infinite body and the growth rate of the melt pool. In order to validate the numerical results, comparisons were made with experimental data. Finally, the results of this paper were compared with those of a similar problem solved with the Fourier theory; the comparison shows the influence of the infinite speed of heat propagation in the Fourier theory on the temperature distribution and the melt pool size.

Keywords: Non-Fourier, Enthalpy technique, Melt pool, Radiational boundary condition
