Search results for: Weighted vertex cover

510 A Numerical Simulation of Solar Distillation for Installation in Chabahar-Iran

Authors: Masoud Afrand, Amin Behzadmehr, Arash Karimipour

Abstract:

The world demand for potable water increases every day with the growing population. Desalination using solar energy is suitable for producing potable water from brackish water and seawater. In this paper, we present a theoretical study of solar distillation in a single basin under the open environmental conditions of Chabahar, Iran. The still has a base area of 2000 mm × 500 mm with a glass cover inclined at 25° in order to capture extra solar energy. We model the still and derive its energy balance equations under minor assumptions. The temperatures of the glass cover, seawater interface, moist air and bottom are computed numerically. The investigation addresses the still productivity, the distilled water salinity and the still performance in terms of efficiency. The calculated still productivity in July was higher than in December, showing that still productivity is a direct function of solar radiation.

Keywords: Inclined Solar still, Solar energy, Solar desalination, Numerical Simulation.

509 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning and management for regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring applications. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented using the integration of a marker-controlled watershed algorithm and a coupled Markov random field (MRF). Then, object-based classification is performed to determine changed/unchanged image objects. Compared with a pixel-based support vector machine (SVM) classifier, this novel segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object level. The experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. The proposed method also correctly distinguishes homogeneous image parcels.

Keywords: Coupled Markov random field, environment, object-based analysis, Polarimetric SAR images.

508 Hamiltonian Related Properties with and without Faults of the Dual-Cube Interconnection Network and Their Variations

Authors: Shih-Yan Chen, Shin-Shin Kao

Abstract:

In this paper, a thorough review of dual-cubes, DCn, related studies and their variations is given. DCn was introduced as a network that retains the desirable properties of the hypercube Qn but has a much smaller diameter. In fact, it is constructed so that the number of vertices of DCn equals the number of vertices of Q_{2n+1}. However, each vertex in DCn is adjacent to n + 1 neighbors, so DCn has (n + 1) × 2^{2n} edges in total, roughly half the number of edges of Q_{2n+1}. In addition, the diameter of any DCn is 2n + 2, which is of the same order as that of Q_{2n+1}. For self-completeness, basic definitions, construction rules and symbols are provided. We chronicle the results, presenting eleven significant theorems, and include some open problems at the end.
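
As an illustration of the construction rules, the sketch below builds one common labelling of DCn (a class bit plus two n-bit fields: cluster edges within one field, plus one cross edge between classes) and checks the vertex and edge counts quoted above. The labelling details are an assumption for illustration, not necessarily the exact construction used in the paper.

```python
from itertools import product

def dual_cube(n):
    """One common construction of the dual-cube DC_n: a vertex is (c, x, y)
    with a class bit c and two n-bit tuples x, y. Vertices with c = 0 form
    hypercubes over x, vertices with c = 1 form hypercubes over y, and each
    (0, x, y) is joined to (1, x, y) by one cross edge, giving degree n + 1."""
    bits = list(product((0, 1), repeat=n))
    vertices = [(c, x, y) for c in (0, 1) for x in bits for y in bits]
    edges = set()
    for c, x, y in vertices:
        for i in range(n):  # hypercube edges inside the vertex's own cluster
            if c == 0:
                nbr = (c, x[:i] + (1 - x[i],) + x[i + 1:], y)
            else:
                nbr = (c, x, y[:i] + (1 - y[i],) + y[i + 1:])
            edges.add(frozenset({(c, x, y), nbr}))
        edges.add(frozenset({(0, x, y), (1, x, y)}))  # cross edge between classes
    return vertices, edges

n = 3
V, E = dual_cube(n)
assert len(V) == 2 ** (2 * n + 1)        # same vertex count as Q_{2n+1}
assert len(E) == (n + 1) * 2 ** (2 * n)  # roughly half the edges of Q_{2n+1}
print(len(V), "vertices,", len(E), "edges")
```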

Keywords: Hypercubes, dual-cubes, fault-tolerant hamiltonian property, dual-cube extensive networks, dual-cube-like networks.

507 A Data Hiding Model with High Security Features Combining Finite State Machines and PMM method

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

Recent years have witnessed the rapid development of the Internet and telecommunication techniques, and information security is becoming more and more important. Applications such as covert communication and copyright protection stimulate research on information hiding techniques. Traditionally, encryption is used to secure communication; however, important information is no longer protected once decoded. Steganography is the art and science of communicating in a way that hides the existence of the communication: important information is first hidden in host data, such as a digital image, video or audio file, and then transmitted secretly to the receiver. In this paper, a data hiding model with high security features, combining cryptography based on a finite state sequential machine with an image based steganography technique, is proposed for communicating information more securely between two locations. The authors incorporate a secret key for authentication at both ends in order to achieve a high level of security. Before the embedding operation, the secret information is encrypted with the help of the finite-state sequential machine and segmented into different parts. The cover image is also segmented into different objects through normalized cut. Each part of the encoded secret information is embedded with the help of a novel image steganographic method (PMM) on different cuts of the cover image to form different stego objects. Finally, the stego image is formed by combining the different stego objects and is transmitted to the receiver. At the receiving end, the opposite processes are run to recover the original secret message.
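
To make the encryption stage concrete, here is a minimal sketch of a Mealy-type finite-state sequential machine used as a bit-stream cipher, together with the inverse pass a receiver would run. The two-state transition and output tables are purely illustrative assumptions; they are not the machine, the key handling or the PMM embedding used by the authors.

```python
def mealy_run(bits, trans, out, state=0):
    """Toy Mealy (finite-state sequential) machine used as a bit-stream cipher:
    each input bit emits one output bit and drives a state transition."""
    emitted = []
    for b in bits:
        emitted.append(out[state][b])
        state = trans[state][b]
    return emitted

def mealy_invert(cipher_bits, trans, out, state=0):
    """Receiver side: recover the input stream, assuming every per-state
    output map is a bijection on {0, 1}."""
    recovered = []
    for c in cipher_bits:
        b = out[state].index(c)   # invert the output map for the current state
        recovered.append(b)
        state = trans[state][b]
    return recovered

# Hypothetical 2-state machine (not the machine from the paper):
# trans[state][input] -> next state, out[state][input] -> emitted bit.
TRANS = [[1, 0], [0, 1]]
OUT = [[0, 1], [1, 0]]  # each row is a bijection, so the cipher is invertible

message = [1, 0, 1, 1, 0, 0, 1]
cipher = mealy_run(message, TRANS, OUT)
assert mealy_invert(cipher, TRANS, OUT) == message
print(cipher)
```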

Keywords: Cover Image, Finite state sequential machine, Mealy machine, Pixel Mapping Method (PMM), Stego Image, NCUT.

506 Guideline for Happy Living According to Sufficiency Economy Philosophy of People and Community Leaders in Urban Communities

Authors: Phusit Phukamchanoad

Abstract:

This research analyzed the activities, based on the sufficiency economy philosophy, of people and community leaders in urban communities. The data were collected through questionnaires administered to 392 people and through interviews with community leaders. It was found that most people lived by activities in accordance with the sufficiency economy philosophy to a high degree, especially in being honest in their occupations and being aware of sufficiency and peacefulness. On the community leaders' side, they reported reducing extravagance, planting home vegetable gardens, keeping household accounts, and planning expenses in three categories: 1) savings to cover illness, 2) savings to cover business, and 3) daily household expenses. The samples also adjusted their living quite well to the rapid change of urbanization. Although these people have encountered hardships, their honesty in their occupations and their awareness of sufficiency allow them to continue to live happily.

Keywords: Sufficiency Economy Philosophy, individual and household activities, urban community.

505 Image Segment Matching Using Affine- Invariant Regions

Authors: Ibrahim El rube'

Abstract:

In this paper, a method for matching image segments using triangle-based (geometrical) regions is proposed. Triangular regions are formed from triples of vertex points obtained from a keypoint detector (SIFT). However, triangle regions are subject to noise and distortion around the edges and vertices (especially at acute angles); therefore, these triangles are expanded into parallelogram-shaped regions. The extracted image segments inherit an important triangle property: invariance to affine distortion. Given two images, corresponding regions are matched by computing the relative affine matrix, rectifying one of the regions with respect to the other, and then calculating the similarity between the reference and rectified regions. The experimental tests show the efficiency and robustness of the proposed algorithm against geometrical distortion.
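
A minimal sketch of the "relative affine matrix" step: three vertex correspondences determine a 2x3 affine map exactly, which can then be used to rectify one region onto the other before computing similarity. The triangles and the NumPy-based solver below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def affine_from_triangles(src, dst):
    """Estimate the 2x3 affine matrix mapping triangle src onto triangle dst;
    three (x, y) vertex correspondences fix an affinity exactly."""
    A = np.hstack([np.asarray(src, float), np.ones((3, 1))])   # rows [x y 1]
    M_T, *_ = np.linalg.lstsq(A, np.asarray(dst, float), rcond=None)
    return M_T.T

src_tri = [(0, 0), (1, 0), (0, 1)]
dst_tri = [(2, 1), (4, 1), (2, 3)]          # src scaled by 2 and shifted by (2, 1)
M = affine_from_triangles(src_tri, dst_tri)
print(M @ np.array([0.5, 0.5, 1.0]))        # a rectified interior point -> [3. 2.]
```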

Keywords: Image matching, key point detection, affine invariant, triangle-shaped segments.

504 Seamless Handover in Urban 5G-UAV Systems Using Entropy Weighted Method

Authors: Anirudh Sunil Warrier, Saba Al-Rubaye, Dimitrios Panagiotakopoulos, Gokhan Inalhan, Antonios Tsourdos

Abstract:

The demand for increased data transfer rates and network traffic capacity has given rise to the concept of heterogeneous networks: wireless networks consisting of devices using different underlying radio access technologies (RATs). For unmanned aerial vehicles (UAVs), this enhanced data rate and network capacity are even more critical, especially in applications such as medicine, delivery missions and the military. In an urban heterogeneous network environment, UAVs must be able to switch seamlessly from one base station (BS) to another to maintain a reliable link; seamless handover in such urban environments has therefore become a major challenge. In this paper, a scheme to achieve seamless handover is developed: an algorithm based on a Received Signal Strength (RSS) criterion is used for network selection, and the Entropy Weighted Method (EWM) is implemented for decision making. Seamless handover using EWM decision making is demonstrated for a UAV moving across fifth generation (5G) and long-term evolution (LTE) networks via a simulation-level analysis. The mobility challenge of UAV-5G communication in heterogeneous networks is thus addressed, and this work could act as a step forward in making UAV-5G architecture integration a possibility.
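
A small sketch of the Entropy Weighted Method as it is commonly defined: criteria whose values vary more across the candidate base stations receive larger weights, and the candidate with the best weighted score becomes the handover target. The candidate matrix and criteria below are hypothetical, not data from the paper.

```python
import numpy as np

def entropy_weights(X):
    """Entropy Weighted Method: criteria whose values are more dispersed
    across the candidates get larger weights. X is (candidates x criteria),
    oriented so that larger values are better."""
    P = X / X.sum(axis=0)                         # column-wise normalisation
    k = 1.0 / np.log(X.shape[0])
    logP = np.log(np.where(P > 0, P, 1.0))        # log(1) = 0 avoids log(0)
    e = -k * (P * logP).sum(axis=0)               # entropy of each criterion
    d = 1.0 - e                                   # degree of diversification
    return d / d.sum()

# Hypothetical candidates: rows = base stations, columns = [RSS, bandwidth, 1/latency]
X = np.array([[0.7, 20.0, 0.10],
              [0.9, 15.0, 0.08],
              [0.6, 30.0, 0.12]])
w = entropy_weights(X)
scores = (X / X.max(axis=0)) @ w                  # weighted score per candidate
print("weights:", w, "-> handover target:", int(np.argmax(scores)))
```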

Keywords: Air to ground, A2G, fifth generation, 5G, handover, mobility, unmanned aerial vehicle, UAV, urban environments.

503 Coverage and Connectivity Problem in Sensor Networks

Authors: Meenakshi Bansal, Iqbal Singh, Parvinder S. Sandhu

Abstract:

In over-deployed sensor networks, one approach to conserve energy is to keep only a small subset of sensors active at any instant. For coverage problems, the monitored area is represented as a set of points that require sensing, called demand points, and the coverage area of a node is taken to be a circle of radius R, where R is the sensing range; if the distance between a demand point and a sensor node is less than R, the node is able to cover that point. We consider a wireless sensor network consisting of a set of randomly deployed sensors. A point in the monitored area is covered if it is within the sensing range of a sensor. In some applications, when the network is sufficiently dense, area coverage can be approximated by guaranteeing point coverage. In this case, the locations of the wireless devices can be used to represent the whole area, and the working sensors are required to cover all the other sensors. We also introduce a hybrid algorithm and discuss challenges related to coverage in sensor networks.
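
The point-coverage condition described above (a demand point is covered when its distance to some sensor is at most the sensing range R) can be checked directly; a minimal sketch with randomly deployed sensors follows. The deployment parameters are illustrative only.

```python
import math
import random

def covered(point, sensors, R):
    """A demand point is covered if it lies within sensing range R of any sensor."""
    return any(math.dist(point, s) <= R for s in sensors)

random.seed(1)
sensors = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(60)]
demand_points = [(x, y) for x in range(0, 101, 5) for y in range(0, 101, 5)]
R = 12.0
ratio = sum(covered(p, sensors, R) for p in demand_points) / len(demand_points)
print(f"fraction of demand points covered: {ratio:.1%}")
```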

Keywords: Wireless sensor networks, network coverage, Energy conservation, Hybrid Algorithms.

502 The Study of the Intelligent Fuzzy Weighted Input Estimation Method Combined with the Experiment Verification for the Multilayer Materials

Authors: Ming-Hui Lee, Tsung-Chien Chen, Tsu-Ping Yu, Horng-Yuan Jang

Abstract:

The innovative intelligent fuzzy weighted input estimation method (FWIEM) can be applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux of multilayer materials, as presented in this paper. The feasibility of the method is verified by a temperature measurement experiment. The experimental module is built from a copper sample stacked on four aluminum samples with different thicknesses. The bottoms of the copper samples are heated by a standard heat source, and the temperatures on the tops of the aluminum samples are measured by thermocouples. The temperature measurements are then taken as inputs to the presented method to estimate the heat flux at the bottoms of the copper samples. The influence on the estimation of the temperature measurement of samples with different thicknesses, the process noise covariance Q, the weighting factor γ, the sampling time interval Δt, and the spatial discretization interval Δx is investigated through the experimental verification. The results show that this method is efficient and robust in estimating the unknown time-varying heat input of multilayer materials.

Keywords: Multilayer Materials, Input Estimation Method, IHCP, Heat Flux.

501 Conflation Methodology Applied to Flood Recovery

Authors: E. L. Suarez, D. E. Meeroff, Y. Yong

Abstract:

Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (CFR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The CFR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The CFR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The CFR model is more accurate than averaging individual observations before calculating the mean and variance or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distribution's means without the additional information provided by each individual distribution variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
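
A minimal numerical sketch of conflation as described above: the conflated density is the normalised pointwise product of the parent densities, here two exponential recovery-time distributions evaluated on a grid. The rates are illustrative assumptions, not fitted values from the study.

```python
import numpy as np

def conflate(pdfs, x):
    """Conflation of densities on a common grid x: the normalised
    pointwise product of the individual PDFs."""
    prod = np.prod(pdfs, axis=0)
    dx = x[1] - x[0]
    return prod / (prod.sum() * dx)

# Hypothetical recovery-time densities (in days): severe vs nuisance events
x = np.linspace(0.01, 120.0, 4000)
severe = (1 / 20) * np.exp(-x / 20)      # exponential, mean 20 days
nuisance = (1 / 5) * np.exp(-x / 5)      # exponential, mean 5 days
cfr_pdf = conflate([severe, nuisance], x)
dx = x[1] - x[0]
print("conflated mean recovery time (days):", (x * cfr_pdf).sum() * dx)
```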

Keywords: Community resilience, conflation, flood risk, nuisance flooding.

500 The Comparison of Some Soil Quality Indexes in Different Land uses of Ghareh Aghaj Watershed of Semirom, Isfahan, Iran

Authors: Bahareh Aghasi, Ahmad Jalalian, Naser Honarjoo

Abstract:

Land use change, if not based on proper scientific investigation, affects the physical, chemical, and biological properties of soil and leads to increased degradation and erosion. It was therefore imperative to study the effects of converting rangelands to farmlands on some soil quality indexes. Undisturbed soil samples were collected from depths of 0-10 and 10-30 cm in pasture with good vegetation cover (GP), pasture with medium vegetation cover (MP), abandoned dry land farming (ADF) and degraded dry land farming (DDF) land uses in the Ghareh Aghaj watershed of Isfahan province. The results revealed that the decreases in organic matter (OM), cation exchange capacity (CEC) and available potassium (AK) at the 0-10 cm depth were 66.6, 38.8 and 70 percent, and at the 10-30 cm depth were 58, 61.4 and 83.5 percent, respectively, in DDF compared with GP. According to the results, it seems that land use change can decrease soil quality, increase soil degradation and lead to undesirable consequences.

Keywords: Land use change, Soil degradation, Soil quality

499 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) is used for classification of the diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most widely used clustering methods in data mining and machine learning applications. In this study, in the first stage, fuzzy C-means clustering is used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset is then weighted according to the ratios of the attribute means to their centers. In the second stage, after the weighting process, classifier algorithms including the support vector machine (SVM) and the k-NN (k-nearest neighbor) classifier are used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
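
A hedged sketch of the centre-based attribute weighting idea: each attribute's values are scaled by the ratio of the attribute mean to the cluster centre nearest to each value. K-means from scikit-learn is used here as a stand-in for fuzzy C-means, and the toy data merely mimics a few Pima-style attributes; this is one interpretation of the weighting rule, not the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def centre_based_attribute_weighting(X, n_clusters=2, seed=0):
    """One reading of the weighting rule: each value of an attribute is scaled
    by the ratio of the attribute mean to the cluster centre nearest to that
    value (k-means stands in for fuzzy C-means)."""
    Xw = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        col = X[:, j].reshape(-1, 1)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(col)
        nearest_centre = km.cluster_centers_[km.labels_, 0]
        Xw[:, j] = col[:, 0] * (col.mean() / nearest_centre)
    return Xw

# Toy data standing in for a few Pima-style attributes
rng = np.random.default_rng(0)
X = rng.normal(loc=[3.0, 120.0, 70.0], scale=[2.0, 30.0, 10.0], size=(200, 3))
X_weighted = centre_based_attribute_weighting(X)
print("per-attribute variance before:", X.var(axis=0).round(1))
print("per-attribute variance after: ", X_weighted.var(axis=0).round(1))
```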

Keywords: Fuzzy C-means clustering, Fuzzy C-means clustering based attribute weighting, Pima Indians diabetes dataset, SVM.

498 Data Hiding in Images in Discrete Wavelet Domain Using PMM

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

Over the last two decades, due to the hostile environment of the Internet, concerns about the confidentiality of information have increased at a phenomenal rate. Therefore, to safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial domain data hiding techniques, the information is embedded directly in the image plane itself. In transform domain techniques, the image is first changed from the spatial domain to some other domain, and the secret information is then embedded so that it remains more secure against attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively low robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work, the authors propose a novel steganographic method for hiding information in the transform domain of a gray scale image. The proposed approach converts the gray level image to the transform domain using a discrete integer wavelet transform through the lifting scheme. It performs a 2-D lifting wavelet decomposition through the Haar lifted wavelet of the cover image and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the PMM technique to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
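
A small sketch of the decompose-embed-reconstruct pipeline: PyWavelets' standard Haar DWT stands in for the lifted integer wavelet transform, and a simple parity adjustment of detail coefficients stands in for the PMM embedding rule, so this only illustrates the data flow, not the paper's exact method.

```python
import numpy as np
import pywt

# A plain Haar DWT stands in for the lifted integer wavelet transform,
# and a parity tweak of detail coefficients stands in for PMM embedding.
cover = np.arange(64 * 64, dtype=float).reshape(64, 64)      # toy cover image
cA, (cH, cV, cD) = pywt.dwt2(cover, "haar")

secret_bits = [1, 0, 1, 1, 0, 1, 0, 0]
flat = cH.flatten()
for i, bit in enumerate(secret_bits):         # embed in the first cH coefficients
    q = int(round(flat[i]))
    flat[i] = q + (bit - (q & 1))             # force the coefficient's parity to `bit`
cH_stego = flat.reshape(cH.shape)

stego = pywt.idwt2((cA, (cH_stego, cV, cD)), "haar")
print("max embedding distortion:", np.abs(stego - cover).max())
```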

Keywords: Cover Image, Pixel Mapping Method (PMM), Stego Image, Integer Wavelet Transform.

497 Biometric Steganography Using Variable Length Embedding

Authors: Souvik Bhattacharyya, Indradip Banerjee, Anumoy Chakraborty, Gautam Sanyal

Abstract:

Recent growth in digital multimedia technologies has provided many facilities for information transmission, reproduction and manipulation; the concept of information security is therefore one of the foremost concerns of the present day. Biometric information security is one such information security mechanism, and it has advantages as well as disadvantages. A biometric system is at risk from a range of attacks intended to bypass the security system or to suspend its normal functioning, and various hazards have been identified in the use of biometric systems. Proper use of steganography greatly reduces the risks to biometric systems from hackers. Steganography is a popular information hiding technique whose goal is to hide information inside a cover medium such as text, image, audio or video, so that it is not possible to detect the existence of the secret information. In this paper, a new security concept is established by making the system more secure with the help of steganography combined with biometric security: the biometric information is embedded in a skin tone portion of an image using the proposed steganographic technique.

Keywords: Biometrics, Skin tone detection, Series, Polynomial, Cover Image, Stego Image.

496 Image Restoration in Non-Linear Filtering Domain using MDB approach

Authors: S. K. Satpathy, S. Panda, K. K. Nagwanshi, C. Ardil

Abstract:

This paper proposes a new technique based on a nonlinear Min-max Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise to the original image. Image noise can be modeled in many ways, and impulse noise is one of them. Impulse noise generates pixels with gray values that are not consistent with their local neighborhood; it appears as a sprinkle of both light and dark spots, or of light spots only, in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image. Thus a variety of nonlinear smoothing techniques have been developed, of which the median filter is one of the most popular. For a small neighborhood it is highly efficient, but for a large window and in the case of high noise it introduces more blurring into the image. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, original pixels may still be corrupted, and noise reduction is substantial only under high-noise conditions, so this technique also blurs the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated along with the standard ones, and various restoration performance measures have been compared.
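
The CWM filter named above is commonly formulated as a centre-weighted median, where the centre pixel is counted several times before the window median is taken; the sketch below follows that reading and applies it to a toy image with impulse noise. It is an illustration of the baseline filter, not the proposed MDB method.

```python
import numpy as np

def centre_weighted_median(img, window=3, centre_weight=3):
    """Centre-weighted median filter: the centre pixel is repeated
    `centre_weight` times before the window median is taken, which keeps
    more detail than the plain median filter."""
    pad = window // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + window, j:j + window].ravel().tolist()
            patch += [img[i, j]] * (centre_weight - 1)   # extra copies of the centre
            out[i, j] = np.median(patch)
    return out

# Toy test: a constant image with two impulse ("salt") pixels
img = np.full((8, 8), 100.0)
img[2, 2] = img[5, 6] = 255.0
restored = centre_weighted_median(img)
print(restored[2, 2], restored[5, 6])   # both impulses are removed -> 100.0
```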

Keywords: Filtering, Minmax Detector Based (MDB), noise, centre weighted mean filter, PSNR, restoration.

495 Inferring User Preference Using Distance Dependent Chinese Restaurant Process and Weighted Distribution for a Content Based Recommender System

Authors: Bagher Rahimpour Cami, Hamid Hassanpour, Hoda Mashayekhi

Abstract:

Nowadays, websites provide a vast number of resources for users. Recommender systems have been developed as an essential element of these websites to provide a personalized environment for users; they help users retrieve resources of interest from large sets of available resources. Due to the dynamic nature of user preference, constructing an appropriate model to estimate user preference is the major task of recommender systems. Profile matching and latent factors are two main approaches to identifying user preference. In this paper, we employ latent factors and profile matching to cluster the user profile and identify user preference, respectively. The method uses the Distance Dependent Chinese Restaurant Process as a Bayesian nonparametric framework to extract the latent factors from the user profile. These latent factors are mapped to user interests, and a weighted distribution is used to identify user preferences. We evaluate the proposed method using a real-world dataset that contains news tweets of a news agency (BBC). The experimental results and comparisons show the superior recommendation accuracy of the proposed approach relative to existing methods, and its ability to effectively evolve over time.

Keywords: Content-based recommender systems, dynamic user modeling, extracting user interests, predicting user preference.

494 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography

Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway

Abstract:

This paper presents a highly secure data hiding technique using image cropping and Least Significant Bit (LSB) steganography. Crops at predefined secret coordinates are extracted from the cover image. The secret text message is divided into sections, the number of which equals the number of image crops. Each section of the secret text message is embedded into an image crop, in a secret sequence, using the LSB technique. The embedding is done using the color channels of the cover image. The stego image is obtained by reassembling the image and the stego crops. The results of the technique are compared with other state-of-the-art techniques. Evaluation is based on visualization to detect any degradation of the stego image, the difficulty of extracting the embedded data by any unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results confirm that the proposed technique is more secure than the other traditional techniques.
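
A minimal sketch of the crop-and-embed idea: a crop at secret coordinates is taken from the cover, the message bits are written into its least significant bits, and the crop is put back to form the stego image. Coordinates, crop size and message are illustrative assumptions; the multi-crop, multi-channel sequencing of the paper is omitted.

```python
import numpy as np

def embed_lsb(crop, bits):
    """Write a bit string into the least significant bits of one image crop."""
    flat = crop.flatten()
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | b        # clear the LSB, then set it to b
    return flat.reshape(crop.shape)

def extract_lsb(crop, n_bits):
    return [int(v & 1) for v in crop.flatten()[:n_bits]]

# Hypothetical cover, secret coordinates and message (illustrative only)
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
bits = [int(b) for byte in "hi".encode() for b in f"{byte:08b}"]

y, x, h, w = 10, 20, 8, 8                     # secret crop coordinates and size
stego_crop = embed_lsb(cover[y:y + h, x:x + w].copy(), bits)
cover[y:y + h, x:x + w] = stego_crop          # reassemble the stego image

assert extract_lsb(cover[y:y + h, x:x + w], len(bits)) == bits
```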

Keywords: Steganography, stego, LSB, crop.

493 A Multiple Beam LTE Base Station Antenna with Simultaneous Vertical and Horizontal Sectorization

Authors: Mohamed Sanad, Noha Hassan

Abstract:

A low wind-load, light-weight, broad-band multi-beam base station antenna has been developed. It can generate any required number of beams with the required beamwidths, and it can provide horizontal and vertical sectorization at the same time. Vertical sectorization doubles the overall number of beams, which will be very valuable in LTE-A and 5G, and can be used to serve vertically split inner and outer cells, improving system performance. The intersection between the beams of the proposed multi-beam antenna can be controlled by optimizing the design parameters of the antenna; the gain at the points of intersection between the beams, the null filling and the overlap between the beams can all be modified. The proposed multi-beam base station antenna can cover an unlimited number of wireless applications, regardless of their frequency bands, and can simultaneously cover all current and future wireless technology generations such as 2G, 3G and 4G (LTE). For example, in LTE it covers the bands 450-470 MHz, 690-960 MHz, 1.4-2.7 GHz and 3.3-3.8 GHz. It has at least 2 ports for each band in each beam for ±45° polarizations and can include up to 72 ports or even more, which could facilitate any further needed capacity expansions.

Keywords: Base station antenna, multi-beam antenna, smart antenna, vertical sectorization.

492 A Shape Optimization Method in Viscous Flow Using Acoustic Velocity and Four-step Explicit Scheme

Authors: Yoichi Hikino, Mutsuto Kawahara

Abstract:

The purpose of this study is to derive optimal shapes of a body located in a viscous flow by the finite element method, using the acoustic velocity and the four-step explicit scheme. The formulation is based on an optimal control theory in which a performance function of the fluid force is introduced; the performance function should be minimized while satisfying the state equation. This problem can be transformed into a minimization problem without constraint conditions by using the adjoint equation, with adjoint variables corresponding to the state equation. The performance function is defined by the drag and lift forces acting on the body. The weighted gradient method is applied as the minimization technique, the Galerkin finite element method is used for spatial discretization, and the four-step explicit scheme is used for temporal discretization to solve the state equation and the adjoint equation. For the interpolation, the orthogonal basis bubble function is employed for velocity and the linear function for pressure. When the orthogonal basis bubble function is used, the mass matrix can be diagonalized without any artificial centralization. The shape optimization is performed by the presented method.

Keywords: Shape Optimization, Optimal Control Theory, Finite Element Method, Weighted Gradient Method, Fluid Force, Orthogonal Basis Bubble Function, Four-step Explicit Scheme, Acoustic Velocity.

491 Image Adaptive Watermarking with Visual Model in Orthogonal Polynomials based Transformation Domain

Authors: Krishnamoorthi R., Sheba Kezia Malarchelvi P. D.

Abstract:

In this paper, an image adaptive, invisible digital watermarking algorithm with Orthogonal Polynomials based Transformation (OPT) is proposed, for copyright protection of digital images. The proposed algorithm utilizes a visual model to determine the watermarking strength necessary to invisibly embed the watermark in the mid frequency AC coefficients of the cover image, chosen with a secret key. The visual model is designed to generate a Just Noticeable Distortion mask (JND) by analyzing the low level image characteristics such as textures, edges and luminance of the cover image in the orthogonal polynomials based transformation domain. Since the secret key is required for both embedding and extraction of watermark, it is not possible for an unauthorized user to extract the embedded watermark. The proposed scheme is robust to common image processing distortions like filtering, JPEG compression and additive noise. Experimental results show that the quality of OPT domain watermarked images is better than its DCT counterpart.

Keywords: Orthogonal Polynomials based Transformation, Digital Watermarking, Copyright Protection, Visual model.

490 Geometry Design Supported by Minimizing and Visualizing Collision in Dynamic Packing

Authors: Johan Segeborn, Johan S. Carlson, Robert Bohlin, Rikard Söderberg

Abstract:

This paper presents a method to support dynamic packing in cases when no collision-free path can be found. The method, which is primarily based on path planning and shrinking of geometries, suggests a minimal geometry design change that results in a collision-free assembly path. A supplementing approach to optimize the geometry design change with respect to redesign cost is described. Supporting this dynamic packing method, a new method to shrink geometry based on vertex translation, interwoven with retriangulation, is suggested. The shrinking method requires neither tetrahedralization nor calculation of the medial axis, and it preserves the topology of the geometry, i.e. holes are neither lost nor introduced. The proposed methods are successfully applied to industrial geometries.

Keywords: Dynamic packing, path planning, shrinking.

489 Features of Soil Formation in the North of Western Siberia in Cryogenic Conditions

Authors: Tatiana V. Raudina, Sergey P. Kulizhskiy

Abstract:

A large part of Russia is located in permafrost areas. These areas are widely used because valuable natural resources are concentrated there. It is therefore important to study cryosols, due to the significant increase in anthropogenic stress as well as the problem of global climate change. In the north of Western Siberia, permafrost phenomena are widespread. Permafrost as a soil-forming factor and cryogenesis as a process have a great impact on soil formation in these areas. Based on the results of research on permafrost-affected soils of tundra landscapes formed in the central part of the Tazovskiy Peninsula under cryogenic conditions, data were obtained that characterize the morphological features of the soils. The specificity of soil cover distribution and the manifestation of soil-forming processes within the study area are noted. Permafrost features such as frost cracking, cryoturbation, thixotropy and movement of humus are formed, and the formation of these features increases with the development of the territory. As a consequence, the components of the environment change and the soil cover is destroyed.

Keywords: Gleyed and nongleyed soils, permafrost, soil cryogenesis (pedocryogenesis), soil-forming macroprocesses.

488 Enhancement Effect of Superparamagnetic Iron Oxide Nanoparticle-Based MRI Contrast Agent at Different Concentrations and Magnetic Field Strengths

Authors: Bimali Sanjeevani Weerakoon, Toshiaki Osuga, Takehisa Konishi

Abstract:

Magnetic Resonance Imaging contrast agents (MRI-CM) are significant in clinical and biological imaging as they have the ability to alter the normal tissue contrast, thereby affecting the signal intensity to enhance the visibility and detectability of images. Superparamagnetic Iron Oxide (SPIO) nanoparticles, coated with dextran or carboxydextran, are currently available for clinical MR imaging of the liver. Most SPIO contrast agents are T2-shortening agents, and Resovist (Ferucarbotran) is a clinically tested, organ-specific SPIO agent with a low molecular weight carboxydextran coating. The enhancement effect of Resovist depends on its relaxivity, which in turn depends on factors such as magnetic field strength, concentration, nanoparticle properties, pH and temperature. This study was therefore conducted to investigate the impact of field strength and different contrast concentrations on the enhancement effect of Resovist. The study explored, by mathematical simulation, the MRI signal intensity of Resovist in the physiological range of plasma from a T2-weighted spin echo sequence at three magnetic field strengths, 0.47 T (r1=15, r2=101), 1.5 T (r1=7.4, r2=95), and 3 T (r1=3.3, r2=160), and over a range of contrast concentrations. The relaxivities r1 and r2 (L mmol-1 s-1) were obtained from a previous study, and the selected concentrations were 0.05, 0.06, 0.07, 0.08, 0.09, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 2.0, and 3.0 mmol/L. T2-weighted images were simulated using a TR/TE of 2000 ms/100 ms. According to the reference literature, the r1 relaxivity tends to decrease with increasing magnetic field strength, while r2 does not show any systematic relationship with the selected field strengths. In parallel, the results of this study revealed that the signal intensity of Resovist tends to be higher at lower concentrations than at higher concentrations. The highest signal intensity was observed at the low field strength of 0.47 T. The maximum signal intensities for 0.47 T, 1.5 T and 3 T were found at concentrations of 0.05, 0.06 and 0.05 mmol/L, respectively. Furthermore, at concentrations higher than these, the signal intensity decreased exponentially. An inverse relationship was found between field strength and T2 relaxation time: as the field strength increased, the T2 relaxation time decreased accordingly. However, the resulting T2 relaxation times were not significantly different between 0.47 T and 1.5 T in this study. Moreover, a linear correlation of the transverse relaxation rate (1/T2, s-1) with the concentration of Resovist was observed. From these results, it can be concluded that the concentration of SPIO nanoparticle contrast agents and the field strength of MRI are two important parameters that affect the signal intensity of a T2-weighted SE sequence; therefore, both parameters should be considered prudently in MR imaging.
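
The simulated quantity can be reproduced from the standard spin-echo signal model, with relaxation rates growing linearly in concentration through r1 and r2. The sketch below uses the TR/TE and relaxivities quoted above, but the baseline T1/T2 values of plasma are assumptions, not values taken from the paper.

```python
import numpy as np

def se_signal(C, r1, r2, T1_0=1200.0, T2_0=200.0, TR=2000.0, TE=100.0):
    """T2-weighted spin-echo signal at contrast concentration C (mmol/L).
    Relaxation rates grow linearly with concentration:
        1/T1 = 1/T1_0 + r1*C,  1/T2 = 1/T2_0 + r2*C   (times in ms).
    The baseline T1_0/T2_0 values are assumed, not taken from the paper."""
    R1 = 1.0 / T1_0 + r1 * C / 1000.0      # relaxivities in L mmol^-1 s^-1 -> per ms
    R2 = 1.0 / T2_0 + r2 * C / 1000.0
    return (1.0 - np.exp(-TR * R1)) * np.exp(-TE * R2)

C = np.array([0.05, 0.06, 0.07, 0.1, 0.5, 1.0, 3.0])
for field, (r1, r2) in {"0.47 T": (15, 101), "1.5 T": (7.4, 95), "3 T": (3.3, 160)}.items():
    print(field, np.round(se_signal(C, r1, r2), 3))
```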

Keywords: Concentration, Resovist, Field strength, Relaxivity, Signal intensity.

487 Optimal Duty-Cycle Modulation Scheme for Analog-To-Digital Conversion Systems

Authors: G. Sonfack, J. Mbihi, B. Lonla Moffo

Abstract:

This paper presents an optimal duty-cycle modulation (ODCM) scheme for analog-to-digital conversion (ADC) systems. The overall ODCM-based ADC problem is decoupled into optimal DCM and digital filtering sub-problems, while taking into account constraints on the mutual design parameters between the two. Using a set of three lemmas and four morphological theorems, the ODCM sub-problem is modelled as a nonlinear cost function with nonlinear constraints. Then, a weighted least pth norm of the error between the ideal and predicted frequency responses is used as the cost function for the digital filtering sub-problem. In addition, the MATLAB fmincon and MATLAB iirlpnorm tools are used as the optimal DCM and least pth norm solvers, respectively. Furthermore, a virtual simulation of an overall prototype ODCM-based ADC system is implemented and tested with the help of Simulink according to a relevant set of design data, i.e., 3 kHz of modulating bandwidth, 172 kHz of maximum modulation frequency and 25 MHz of sampling frequency. Finally, the results obtained show that over the 3 kHz modulating bandwidth the ODCM-based ADC achieves 57 dBc of SINAD (signal-to-noise and distortion), 58 dB of SFDR (spurious-free dynamic range), -80 dBc of THD (total harmonic distortion), and a minimum resolution of 10 bits. These performance levels appear to be a great challenge within the class of oversampling ADC topologies, with a 2nd order IIR (infinite impulse response) decimation filter.

Keywords: Digital IIR filter, morphological lemmas and theorems, optimal DCM-based ADC, virtual simulation, weighted least pth norm.

486 Spanning Tree Transformation of Connected Graphs into Single-Row Networks

Authors: S.L. Loh, S. Salleh, N.H. Sarmin

Abstract:

A spanning tree of a connected graph is a tree that consists of the set of vertices and some, or perhaps all, of the edges of the connected graph. In this paper, a model for the spanning tree transformation of connected graphs into single-row networks, namely the Spanning Tree of Connected Graph Modeling (STCGM), is introduced. The model contains a Path-Growing Tree-Forming algorithm applied with Vertex-Prioritized to produce the spanning tree from the connected graph: paths are produced by Path-Growing and are combined into a spanning tree by Tree-Forming. The spanning tree produced from the connected graph is then transformed into a single-row network using Tree Sequence Modeling (TSM). Finally, the single-row routing problem is solved using a method called Enhanced Simulated Annealing for Single-Row Routing (ESSR).
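
For orientation, a plain BFS spanning tree of an adjacency-dictionary graph is sketched below, rooted at a highest-degree vertex; it is only a simple stand-in for the vertex-prioritised Path-Growing/Tree-Forming procedure of STCGM.

```python
from collections import deque

def bfs_spanning_tree(adj, root):
    """BFS spanning tree of a connected graph given as an adjacency dict
    (a plain stand-in for the vertex-prioritised Path-Growing/Tree-Forming
    procedure of STCGM)."""
    tree_edges, seen, queue = [], {root}, deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                tree_edges.append((u, v))
                queue.append(v)
    return tree_edges

adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5], 4: [2, 5], 5: [3, 4]}
root = max(adj, key=lambda v: len(adj[v]))      # start from a highest-degree vertex
print(bfs_spanning_tree(adj, root))             # |V| - 1 = 4 tree edges
```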

Keywords: Graph theory, simulated annealing, single-row routing and spanning tree.

485 A Hybridized Competency-Based Teacher Candidate Selection System

Authors: R. Ramli, M. I. Ghazali, H. Ibrahim, M. M. Kasim, F. M. Kamal, S. Vikneswari

Abstract:

Teachers form the backbone of any educational system; hence, selecting qualified candidates is crucial. In Malaysia, the decision making in the selection process involves several stages: initial filtering through academic achievement, an entry examination and an interview session. The last stage is the most challenging since it depends heavily on human judgment. Therefore, this study sought to identify the selection criteria for teacher candidates that form the basis for an efficient multi-criteria teacher-candidate selection model for that last stage. The relevant criteria were determined from the literature and from the input of experts, namely those involved in interviewing teacher candidates at a public university offering the formal training program. Three main competency criteria were identified, namely content knowledge, communication skills and personality, and each main criterion was further divided into several sub-criteria. The Analytic Hierarchy Process (AHP) technique was employed to allocate weights to the criteria and was then integrated with a Simple Weighted Average (SWA) scoring approach to develop the selection model. Subsequently, a web-based decision support system was developed to assist in the process of selecting qualified teacher candidates. The Teacher-Candidate Selection (TeCaS) system is able to assist the panel of interviewers during the selection process, which involves a large amount of complex qualitative judgment.
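
A compact sketch of the AHP-plus-SWA scoring chain: criterion weights are taken as the normalised principal eigenvector of a pairwise-comparison matrix, and each candidate's score is a simple weighted average of criterion marks. The comparison matrix and candidate marks are hypothetical, not data from the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP criterion weights: the normalised principal eigenvector of the
    pairwise-comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Hypothetical pairwise comparisons of the three main criteria:
# content knowledge vs communication skills vs personality
pairwise = [[1, 2, 3],
            [1 / 2, 1, 2],
            [1 / 3, 1 / 2, 1]]
w = ahp_weights(pairwise)

# Simple Weighted Average scores for two hypothetical candidates (marks out of 10)
candidates = {"A": [8, 6, 7], "B": [7, 9, 6]}
for name, marks in candidates.items():
    print(name, round(float(np.dot(w, marks)), 2))
```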

Keywords: Analytic Hierarchy Process, Simple Weighted Average, Decision Support System, Multi-criteria decision making problem.

484 Evaluation of Energy Upgrade Measures and Connection of Renewable Energy Sources Using Software Tools: Case Study of an Academic Library Building in Larissa, Greece

Authors: Giwrgos S. Gkarmpounis, Aikaterini G. Rokkou, Marios N. Moschakis

Abstract:

Increased energy consumption in academic buildings creates the need to implement energy saving measures and to take advantage of renewable energy sources to cover the electrical needs of those buildings. An academic library is used as a case study. With the aid of the RETScreen software, which takes into account the energy consumption and characteristics of the library building, it is shown that measures such as replacing fluorescent lights with LED lights, installing outdoor shading, replacing the openings and installing a Building Management System provide a high level of energy savings. Moreover, given the available space of the building and the climatic data, the installation of a 100 kW photovoltaic system can also cover a considerable amount of the building's energy consumption, unlike a wind system, which seems unpromising. Lastly, the HOMER software is used to compare the use of a photovoltaic system against a wind system in order to verify the results obtained from the RETScreen software concerning the renewable energy sources.

Keywords: Energy saving measures, HOMER software, renewable energy sources, RETScreen software, energy efficiency and quality.

483 Scatterer Density in Edge and Coherence Enhancing Nonlinear Anisotropic Diffusion for Medical Ultrasound Speckle Reduction

Authors: Ahmed Badawi, J. Michael Johnson, Mohamed Mahfouz

Abstract:

This paper proposes new enhancements to nonlinear anisotropic diffusion methods to greatly reduce speckle and preserve image features in medical ultrasound images. By incorporating a local physical characteristic of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge enhancing (EE) and coherence enhancing (CE) diffusion. The new enhancement methods were tested using various ultrasound images, including phantom and some clinical images, to determine the amount of speckle reduction and of edge and coherence enhancement. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts that use the gradient alone to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. SDWNAD's superior performance over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), the adaptive weighted median filter (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
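
A scalar (Perona-Malik style) sketch of the weighting idea: the diffusion conductivity is driven by the local gradient and additionally suppressed where a scatterer-density map is high, so structured regions are smoothed less. The paper's method is tensor-based; this simplified scalar version and its toy density estimate are illustrative assumptions only.

```python
import numpy as np

def sd_weighted_diffusion(img, density, n_iter=20, kappa=30.0, dt=0.15):
    """Scalar (Perona-Malik style) sketch: the conductivity falls with the
    local gradient and is further suppressed where the scatterer-density map
    is high, so structured regions are smoothed less."""
    u = img.astype(float).copy()
    suppress = 1.0 - np.clip(density, 0.0, 1.0)
    for _ in range(n_iter):
        flux = np.zeros_like(u)
        for shift, axis in ((-1, 0), (1, 0), (-1, 1), (1, 1)):
            d = np.roll(u, shift, axis=axis) - u          # difference to one neighbour
            flux += np.exp(-(d / kappa) ** 2) * suppress * d
        u += dt * flux
    return u

rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[:, 32:] = 100.0                                      # a step edge
noisy = clean + rng.normal(0, 10, clean.shape)             # toy noise
density = np.clip(np.abs(np.gradient(noisy, axis=1)) / 50.0, 0, 1)  # crude density proxy
print("mean abs error after diffusion:",
      np.abs(sd_weighted_diffusion(noisy, density) - clean).mean())
```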

Keywords: Nonlinear anisotropic diffusion, ultrasound imaging, speckle reduction, scatterer density estimation, edge based enhancement, coherence enhancement.

482 Ecosystem Post-Wildfire Effects of Thasos Island

Authors: George D. Ranis, Valasia Iakovoglou, George N. Zaimes

Abstract:

Fire is one of the main types of disturbance that shape ecosystems in the Mediterranean region. Nowadays, however, climate alteration towards higher temperatures results in increased fire intensity, frequency and spread, as well as in difficulties for natural regeneration to occur. Thasos Island is one of the Greek islands that has experienced these problems: since 1984, a series of wildfires has reduced the forest cover from 61.6% to almost 20%. The negative impacts were devastating for the island in many different respects. The absence of plant cover, post-wildfire precipitation and steep slopes were the major factors that induced severe soil erosion and intense floods. This also caused serious economic problems for the local communities and left the burnt areas unable to regenerate naturally. Despite the substantial amount of published work on the Thasos wildfires, there is no information on post-wildfire effects on factors such as soil erosion. More research on post-fire effects should help in an overall assessment of the negative impacts of wildfires on land degradation through processes such as soil erosion and flooding.

Keywords: Erosion, land degradation, Mediterranean islands, regeneration, Thasos, wildfires.

481 Evaluation of a Bio-Mechanism by Graphed Static Equilibrium Forces

Authors: A.Y. Bani Hashim, N.A. Abu Osman, W.A.B. Wan Abas, L. Abdul Latif

Abstract:

The unique structural configuration of the human foot allows easy walking; similar movement is hard to imitate even for an ape. It is obvious that human ambulation is related to the foot structure itself. Suppose the bones are represented as vertices and the joints as edges. This leads to the development of a special graph that represents the human foot. On a footprint there are points-of-contact that touch the ground, involving specific vertices. Theoretically, for an ideal ambulation, these points provide reactions onto the ground, the static equilibrium forces, and they are arranged in sequence in the form of a path that the ambulating footprint follows. Combining the human foot graph with this path results in a representation that describes the profile of an ideal ambulation. This profile gives the locations where the points-of-contact experience normal reaction forces and highlights the significance of these points.
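
A small illustration of the representation described above: bones as vertices, joints as edges, and a contact path whose vertices carry the ground reactions. The bone list, adjacencies and contact sequence are simplified and illustrative assumptions, not the authors' graph.

```python
# Bones as vertices, joints as edges; the contact path lists the vertices that
# carry ground reactions during one idealised step (heel strike to toe-off).
foot_graph = {
    "calcaneus": ["talus", "cuboid"],
    "talus": ["calcaneus", "navicular"],
    "navicular": ["talus", "cuneiform"],
    "cuneiform": ["navicular", "metatarsal_1"],
    "cuboid": ["calcaneus", "metatarsal_5"],
    "metatarsal_1": ["cuneiform", "hallux"],
    "metatarsal_5": ["cuboid", "fifth_toe"],
    "hallux": ["metatarsal_1"],
    "fifth_toe": ["metatarsal_5"],
}

contact_path = ["calcaneus", "cuboid", "metatarsal_5", "metatarsal_1", "hallux"]

# The ambulation profile: path vertices (all present in the graph) tagged as
# points-of-contact that supply the static equilibrium (ground reaction) forces.
profile = [(v, "ground reaction") for v in contact_path if v in foot_graph]
print(profile)
```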

Keywords: Ambulation, edge, foot, graph, vertex.
