Search results for: Constrained discrete combinatorial choice
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1226

956 On the Prediction of Transmembrane Helical Segments in Membrane Proteins Based on Wavelet Transform

Authors: Yu Bin, Zhang Yan

Abstract:

The prediction of transmembrane helical segments (TMHs) in membrane proteins is an important field in bioinformatics research. In this paper, a new method based on the discrete wavelet transform (DWT) has been developed to predict the number and location of TMHs in membrane proteins. The protein with PDB code 1KQG was chosen as an example to illustrate the prediction of the number and location of TMHs with this method. To assess the performance of the method, 80 proteins with known 3D structure (containing 325 TMHs in total) were chosen at random from the MPtopo database as test objects; 308 of the TMHs were predicted accurately, and the average prediction accuracy is 96.3%. In addition, the 80 membrane proteins were divided into 13 groups according to their function and type, and the prediction results for the TMHs of each of the 13 groups are also satisfactory.
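
A minimal sketch of the general idea — smoothing a Kyte-Doolittle hydrophobicity profile with the low-frequency (approximation) part of a DWT and reading off long hydrophobic stretches as candidate TMHs — assuming the PyWavelets library; the wavelet, level and threshold are placeholders, not the authors' parameters:

    import numpy as np
    import pywt

    # Kyte-Doolittle hydrophobicity scale.
    KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
          'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
          'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
          'Y': -1.3, 'V': 4.2}

    def candidate_tmh_regions(sequence, wavelet="db4", level=3, threshold=1.0):
        """Smooth the hydrophobicity profile with a DWT low-pass reconstruction
        and return contiguous regions above the threshold as candidate TMHs."""
        profile = np.array([KD[a] for a in sequence], dtype=float)
        coeffs = pywt.wavedec(profile, wavelet, level=level)
        coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # keep approximation only
        smooth = pywt.waverec(coeffs, wavelet)[:len(profile)]
        regions, start = [], None
        for i, above in enumerate(smooth > threshold):
            if above and start is None:
                start = i
            elif not above and start is not None:
                regions.append((start, i - 1))
                start = None
        if start is not None:
            regions.append((start, len(smooth) - 1))
        return [r for r in regions if r[1] - r[0] + 1 >= 15]  # typical minimum TMH length

    print(candidate_tmh_regions("MKTLLILAVLLAIWLLLFAGNSDE" * 3))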

Keywords: discrete wavelet transform, hydrophobicity, membrane protein, transmembrane helical segments

955 Optimization Using Simulation of the Vehicle Routing Problem

Authors: Nayera E. El-Gharably, Khaled S. El-Kilany, Aziz E. El-Sayed

Abstract:

A key element of many distribution systems is the routing and scheduling of vehicles servicing a set of customers. A wide variety of exact and approximate algorithms have been proposed for solving the vehicle routing problem (VRP). Exact algorithms can only solve relatively small instances of the VRP, which is NP-hard, while several approximate algorithms have proven successful in finding feasible, though not necessarily optimal, solutions. Although different parts of the problem are stochastic in nature, limited work has addressed the problem using discrete event system simulation. Presented here is optimization using simulation of the VRP: a simplified problem has been developed in the ExtendSim simulation environment, and the ExtendSim evolutionary optimizer is used to minimize the total transportation cost of the problem. Results obtained from the model are very satisfactory. Further complexities of the problem are proposed for consideration in future work.

Keywords: Discrete event system simulation, optimization using simulation, vehicle routing problem.

954 Robust and Transparent Spread Spectrum Audio Watermarking

Authors: Ali Akbar Attari, Ali Asghar Beheshti Shirazi

Abstract:

In this paper, we propose a blind and robust audio watermarking scheme based on spread spectrum in the Discrete Wavelet Transform (DWT) domain. Watermarks are embedded in the low-frequency coefficients, where they are less audible. The key idea is to divide the audio signal into small frames and to modify the magnitude of the 6th-level DWT approximation coefficients according to the Direct Sequence Spread Spectrum (DSSS) technique. In addition, a psychoacoustic model is used to enhance imperceptibility, and a Savitzky-Golay filter is used to increase extraction accuracy. The experimental results illustrate high robustness against the most common attacks, i.e. Gaussian noise addition, low-pass filtering, resampling, requantization and MP3 compression, without significant perceptual distortion (the ODG is higher than -1). The proposed scheme has a data payload of about 83 bps.
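
A minimal sketch of DSSS-style embedding in the level-6 DWT approximation band, assuming the PyWavelets library; the wavelet, frame handling, embedding strength and key handling are illustrative assumptions, and the psychoacoustic model and Savitzky-Golay post-processing of the paper are omitted:

    import numpy as np
    import pywt

    def embed_frame(frame, bit, key_seed, alpha=0.5, wavelet="db4", level=6):
        """Spread one watermark bit over the level-6 approximation coefficients
        of an audio frame using a key-dependent pseudo-noise (PN) sequence."""
        coeffs = pywt.wavedec(frame, wavelet, level=level)
        approx = coeffs[0]
        pn = np.random.default_rng(key_seed).choice([-1.0, 1.0], size=approx.shape)
        strength = alpha * np.mean(np.abs(approx))         # crude stand-in for a psychoacoustic model
        coeffs[0] = approx + strength * (1.0 if bit else -1.0) * pn
        return pywt.waverec(coeffs, wavelet)[:len(frame)]

    def extract_frame(frame, key_seed, wavelet="db4", level=6):
        """Blind detection: correlate the approximation band with the same PN sequence."""
        approx = pywt.wavedec(frame, wavelet, level=level)[0]
        pn = np.random.default_rng(key_seed).choice([-1.0, 1.0], size=approx.shape)
        return int(np.dot(approx, pn) > 0)

    frame = np.random.randn(4096)                          # stand-in for one audio frame
    print(extract_frame(embed_frame(frame, bit=1, key_seed=42), key_seed=42))  # 1, with high probability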

Keywords: Audio watermarking, spread spectrum, discrete wavelet transform, psychoacoustic, Savitzky-Golay filter.

953 Contourlet versus Wavelet Transform for a Robust Digital Image Watermarking Technique

Authors: Ibrahim A. El rube, Mohamad Abou El Nasr, Mostafa M. Naim, Mahmoud Farouk

Abstract:

In this paper, a watermarking algorithm that uses the wavelet transform with Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are utilized in the comparison between the wavelet-based and the contourlet-based methods: Peak Signal to Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and performs well compared to the contourlet-based algorithm.

Keywords: Image watermarking, discrete wavelet transform, discrete contourlet transform, multiple description coding, quantization index modulation.

952 Blueprinting of Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation and logistics perspective, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be kept minimal, especially at the time of launching an organization-wide global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP package for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP). Both methods start with the customer requirements, then blueprinting of the business processes, and finally mapping those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing them will result in normalized software.

Keywords: Blueprint, ERP, SDLC, Modular.

951 Walsh-Hadamard Transform for Facial Feature Extraction in Face Recognition

Authors: M. Hassan, I. Osman, M. Yahia

Abstract:

This paper proposes a new facial feature extraction approach based on the Walsh-Hadamard Transform (WHT). The approach exploits the correlation between local pixels of the face image, and its primary advantage is the simplicity of its computation. The paper compares the proposed approach, WHT, which has traditionally been used in data compression, with two other well-known approaches, Principal Component Analysis (PCA) and the Discrete Cosine Transform (DCT), using the face database of the Olivetti Research Laboratory (ORL). In spite of its simple computation, the proposed WHT algorithm gave results very close to those obtained by PCA and DCT. This paper initiates research into the WHT and the family of frequency transforms and examines their suitability for feature extraction in face recognition applications.
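
A minimal sketch of Walsh-Hadamard feature extraction for a face image, assuming NumPy and SciPy (scipy.linalg.hadamard); the image size, coefficient selection and feature length are illustrative assumptions:

    import numpy as np
    from scipy.linalg import hadamard

    def wht_features(face_image, keep=64):
        """Project an N x N image (N a power of two) onto the Walsh-Hadamard basis
        and keep the first coefficients (natural Hadamard ordering) as features."""
        n = face_image.shape[0]
        H = hadamard(n) / np.sqrt(n)           # orthonormal Hadamard matrix
        coeffs = H @ face_image @ H.T          # separable 2-D transform
        return coeffs.ravel()[:keep]

    img = np.random.rand(64, 64)               # stand-in for a cropped, resized face image
    print(wht_features(img).shape)             # (64,)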

Keywords: Face recognition, facial feature extraction, principal component analysis, discrete cosine transform, Walsh-Hadamard transform.

950 Modeling a Multinomial Logit Model of Intercity Travel Mode Choice Behavior for All Trips in Libya

Authors: Manssour A. Abdulsalam Bin Miskeen, Ahmed Mohamed Alhodairi, Riza Atiq Abdullah Bin O. K. Rahmat

Abstract:

From a planning point of view, mode choice modeling is essential because of the massive costs incurred in transportation systems. Intercity travellers in Libya have distinct characteristics compared with travellers from other countries, including cultural and socioeconomic factors. Consequently, the goal of this study is to model intercity travel behavior using disaggregate models in order to project the demand for nation-level intercity travel in Libya. A multinomial logit model for all intercity trips has been formulated to examine national-level intercity transportation in Libya. The model was calibrated using a nationwide revealed preference (RP) and stated preference (SP) survey and was developed for different intercity trip purposes (work, social and recreational). The model parameters were estimated using the maximum likelihood method. The data needed for model development were obtained from all major intercity corridors in Libya; the final sample consisted of 1300 interviews. About two-thirds of these data were used for model calibration, and the remaining part was used for model validation. This study, which is the first of its kind in Libya, investigates intercity travellers' mode-choice behavior. The intercity travel mode-choice model was successfully calibrated and validated. The outcomes indicate that the overall model is effective and yields high estimation precision. The proposed model is useful because it is sensitive to many variables and can be employed to determine the impact of changes in various characteristics on the demand for different travel modes. Estimates from the model may also be valuable to planners, who can estimate mode shares and determine the impact of specific policy changes on the demand for intercity travel.
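
As background, the multinomial logit choice probability for mode i is P(i) = exp(V_i) / sum_j exp(V_j), where V_i is the systematic utility of mode i. A minimal sketch in Python with purely hypothetical modes, attributes and coefficients (not the calibrated Libyan model):

    import numpy as np

    def mnl_probabilities(utilities):
        """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
        v = np.asarray(utilities, dtype=float)
        e = np.exp(v - v.max())                # subtract max for numerical stability
        return e / e.sum()

    beta_time, beta_cost = -0.05, -0.02        # hypothetical coefficients
    modes = {"car": (180, 25.0), "bus": (240, 8.0), "shared taxi": (200, 15.0)}  # (minutes, cost)
    V = [beta_time * t + beta_cost * c for t, c in modes.values()]
    print(dict(zip(modes, np.round(mnl_probabilities(V), 3))))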

Keywords: Multinomial logit model, improved intercity transport, intercity mode-choice behavior, disaggregate analysis.

949 Statistical Wavelet Features, PCA, and SVM Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

The study of the electrical signals produced by the neural activity of the human brain is called electroencephalography (EEG). In this paper, we propose an automatic and efficient EEG signal classification approach which classifies an EEG signal into one of two classes: epileptic seizure or non-seizure. In the proposed approach, we start by extracting features with the Discrete Wavelet Transform (DWT), which decomposes the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
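
A minimal sketch of the DWT-feature / PCA / SVM pipeline, assuming PyWavelets and scikit-learn; the feature set, wavelet, decomposition level and placeholder data are illustrative assumptions:

    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def dwt_features(signal, wavelet="db4", level=4):
        """Mean, standard deviation and energy of each DWT sub-band."""
        feats = []
        for band in pywt.wavedec(signal, wavelet, level=level):
            feats += [band.mean(), band.std(), np.sum(band ** 2)]
        return feats

    # Placeholder EEG epochs; y: 0 = non-seizure, 1 = seizure.
    X_raw = np.random.randn(40, 1024)
    y = np.random.randint(0, 2, size=40)

    X = np.array([dwt_features(s) for s in X_raw])
    clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
    clf.fit(X, y)
    print(clf.score(X, y))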

Keywords: Discrete Wavelet Transform, Electroencephalogram, Pattern Recognition, Principal Component Analysis, Support Vector Machine.

948 A New Heuristic Approach for the Stock-Cutting Problems

Authors: Stephen C. H. Leung, Defu Zhang

Abstract:

This paper addresses a stock-cutting problem with rotation of items and without the guillotine cutting constraint. In order to solve the large-scale problem effectively and efficiently, we propose a simple but fast heuristic algorithm. It is shown that this heuristic outperforms the latest published algorithms for large-scale problem instances.

Keywords: Combinatorial optimization, heuristic, large-scale, stock-cutting.

947 400 kW Six Analytical High Speed Generator Designs for Smart Grid Systems

Authors: A. El Shahat, A. Keyhani, H. El Shewy

Abstract:

High-speed PM generators driven by micro-turbines are widely used in smart grid systems. This paper therefore presents a comparative study of six classical, optimized and genetic analytical design cases for High Speed Permanent Magnet Synchronous Generators (HSPMSGs) with 400 kW output power at a tip speed of 200 m/s. The six design trials are: classical sizing; unconstrained optimization minimizing total losses; and constrained optimization of total mass with bounded constraints introduced in the problem formulation. A genetic algorithm is then formulated to obtain maximum efficiency while minimizing machine size. In the second genetic problem formulation, we attempt to obtain minimum mass, with the machine sizing constrained by a non-linear constraint function of machine losses. Finally, an optimum torque-per-ampere genetic sizing is predicted. All results are simulated with MATLAB, the Optimization Toolbox and its Genetic Algorithm. Finally, the six analytical design examples are compared, with a study of machine waveforms, THD and rotor losses.

Keywords: High Speed, Micro - Turbines, Optimization, PM Generators, Smart Grid, MATLAB.

946 Enhancement of Low Contrast Satellite Images using Discrete Cosine Transform and Singular Value Decomposition

Authors: A. K. Bhandari, A. Kumar, P. K. Padhy

Abstract:

In this paper, a novel technique for contrast enhancement of low-contrast satellite images is proposed based on singular value decomposition (SVD) and the discrete cosine transform (DCT). The singular value matrix represents the intensity information of the given image, and any change to the singular values changes the intensity of the input image. The proposed technique converts the image into the SVD-DCT domain and, after normalizing the singular value matrix, reconstructs the enhanced image using the inverse DCT. The visual and quantitative results suggest that the proposed SVD-DCT method is clearly more effective and flexible than existing methods such as linear contrast stretching, GHE, DWT-SVD, DWT, decorrelation stretching and gamma-correction-based techniques.
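
A minimal sketch of one plausible SVD-DCT enhancement step — scaling the singular values of the DCT coefficient matrix by a correction factor derived from a globally equalized copy of the image — assuming NumPy and SciPy; the exact normalization used by the authors may differ:

    import numpy as np
    from scipy.fft import dctn, idctn

    def hist_equalize(img):
        """Global histogram equalization of an 8-bit greyscale image."""
        hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
        cdf = hist.cumsum() / img.size
        return np.interp(img.ravel(), np.arange(256), 255 * cdf).reshape(img.shape)

    def svd_dct_enhance(img):
        """Scale the singular values of the DCT coefficient matrix using the
        equalized image as a reference, then reconstruct with the inverse DCT."""
        d_in = dctn(img.astype(float), norm="ortho")
        d_ref = dctn(hist_equalize(img), norm="ortho")
        u, s, vt = np.linalg.svd(d_in, full_matrices=False)
        xi = np.linalg.svd(d_ref, compute_uv=False).max() / s.max()   # correction factor
        return np.clip(idctn(u @ np.diag(xi * s) @ vt, norm="ortho"), 0, 255)

    img = (np.random.rand(128, 128) * 60).astype(np.uint8)   # synthetic low-contrast image
    print(img.max(), svd_dct_enhance(img).max())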

Keywords: Singular value decomposition (SVD), discrete cosine transform (DCT), image equalization, satellite image contrast enhancement.

945 A Secure Semi-Fragile Watermarking Scheme for Authentication and Recovery of Images Based On Wavelet Transform

Authors: Rafiullah Chamlawi, Asifullah Khan, Adnan Idris, Zahid Munir

Abstract:

Authentication of multimedia content has gained much attention in recent times. In this paper, we propose a secure semi-fragile watermarking scheme with a choice of two watermarks to be embedded. The technique operates in the integer wavelet domain and makes use of semi-fragile watermarks to achieve better robustness. A self-recovery algorithm is employed that hides the image digest in some wavelet sub-bands in order to detect possible malicious object manipulation of the image (object replacement and/or deletion). The semi-fragility makes the scheme tolerant to JPEG lossy compression down to a quality factor of 70%, while still locating the tampered area accurately. In addition, the system ensures greater security because the embedded watermarks are protected with private keys. The computational complexity is reduced by using a parameterized integer wavelet transform. Experimental results show that the proposed scheme guarantees the safety of the watermark, image recovery, and accurate localization of the tampered area.

Keywords: Integer Wavelet Transform (IWT), Discrete Cosine Transform (DCT), JPEG compression, authentication, self-recovery.

944 A Budget and Deadline Constrained Fault Tolerant Load Balanced Scheduling Algorithm for Computational Grids

Authors: P. Keerthika, P. Suresh

Abstract:

A grid is an environment with millions of resources which are dynamic and heterogeneous in nature. A computational grid is one in which the resources are computing nodes, and it is meant for applications that involve large computations. A scheduling algorithm is said to be efficient if and only if it performs good resource allocation even in the case of resource failure. Resource allocation is a tedious issue since it has to consider several requirements such as system load, processing cost and time, the user's deadline, and resource failure. This work attempts to design a resource allocation algorithm which is cost-effective and also targets load balancing, fault tolerance and user satisfaction by considering the above requirements. The proposed Budget Constrained Load Balancing Fault Tolerant algorithm with user satisfaction (BLBFT) reduces the schedule makespan, schedule cost and task failure rate and improves resource utilization. Evaluation of the proposed BLBFT algorithm is done using the GridSim toolkit, and the results are compared with algorithms which concentrate on these factors separately. The comparison results confirm that the proposed algorithm works better than its counterparts.

Keywords: Grid Scheduling, Load Balancing, fault tolerance, makespan, cost, resource utilization.

943 Network-Constrained AC Unit Commitment under Uncertainty Using a Bender’s Decomposition Approach

Authors: B. Janani, S. Thiruvenkadam

Abstract:

In this work, the impact of a stochastic approach on the day-ahead unit commitment is evaluated, and comparisons between stochastic and deterministic unit commitment solutions are provided. The unit commitment model consists of minimizing the total operating costs while considering the units' technical constraints, such as ramping rates and minimum up and down times. Load shedding and wind power spilling are allowed, but at inflated operational costs. The evaluation process consists of calculating the optimal unit commitment and verifying that the considered constraints are fulfilled. For the calculation of the optimal unit commitment, an algorithm based on Benders decomposition, namely dual dynamic programming, was developed. Two approaches were considered for the construction of stochastic solutions, and data related to wind power outputs from two different operational days are considered in the analysis. Stochastic and deterministic solutions are compared based on the actual measured wind power output on the operational day. Through a technique capable of finding representative wind power scenarios and their probabilities, the system can analyze in more detail the expected final operational cost.

Keywords: Benders’ decomposition, network constrained AC unit commitment, stochastic programming, wind power uncertainty.

942 A Discrete Event Simulation Model to Manage Bed Usage for Non-Elective Admissions in a Geriatric Medicine Speciality

Authors: Muhammed Ordu, Eren Demir, Chris Tofallis

Abstract:

Over the past decade, non-elective admissions in the UK have increased significantly. Taking into account limited resources (i.e. beds), the related service managers are obliged to manage their resources effectively, since non-elective admissions are mostly admitted to inpatient specialities via A&E departments. Geriatric medicine is one of the specialities with long lengths of stay for non-elective admissions. This study aims to develop a discrete event simulation model to understand how possible increases in non-elective demand over the next 12 months affect the bed occupancy rate, and to determine the required number of beds in a geriatric medicine speciality in a UK hospital. Our validated simulation model takes into account observed frequency distributions for non-elective admissions and length of stay, derived from a large dataset covering the period April 2009 to January 2013. An experimental analysis consisting of 16 experiments is carried out to better understand the possible effects of scenarios involving increases in demand and in the number of beds. As a result, the speciality does not achieve the target level in the base model, although the bed occupancy rate decreases from 125.94% to 96.41% when the number of beds is increased by 30%. In addition, the number of beds required to meet the bed requirement is larger than the number considered in the scenario analysis. This paper sheds light on bed management for service managers in geriatric medicine specialities.
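
A minimal sketch of a bed-occupancy discrete event simulation, assuming the SimPy library; the arrival rate, length-of-stay distribution and bed count are placeholders, not the fitted hospital data:

    import random
    import simpy

    BEDS, SIM_DAYS = 120, 365
    MEAN_INTERARRIVAL, MEAN_LOS = 0.25, 12.0           # days; placeholder rates
    occupied_bed_days = 0.0

    def patient(env, beds):
        global occupied_bed_days
        with beds.request() as req:
            yield req                                  # wait for a free bed
            los = random.expovariate(1.0 / MEAN_LOS)
            occupied_bed_days += los
            yield env.timeout(los)                     # occupy the bed for the length of stay

    def arrivals(env, beds):
        while True:
            yield env.timeout(random.expovariate(1.0 / MEAN_INTERARRIVAL))
            env.process(patient(env, beds))

    env = simpy.Environment()
    beds = simpy.Resource(env, capacity=BEDS)
    env.process(arrivals(env, beds))
    env.run(until=SIM_DAYS)
    print("approximate bed occupancy rate:", occupied_bed_days / (BEDS * SIM_DAYS))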

Keywords: Bed management, bed occupancy rate, discrete event simulation, geriatric medicine, non-elective admission.

941 A Decision Boundary based Discretization Technique using Resampling

Authors: Taimur Qureshi, Djamel A Zighed

Abstract:

Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of the induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason we argue that a discretization performed on a sample of the population is only an estimate of the discretization of the entire population. Most existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thus improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is therefore to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
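
A minimal sketch of bootstrap-based cut-point estimation for a single continuous attribute with binary class labels, assuming NumPy; the entropy-based cut-point criterion and the aggregation by averaging are illustrative choices, not necessarily the authors':

    import numpy as np

    def best_cut(x, y):
        """Single cut point maximizing information gain for binary labels y."""
        order = np.argsort(x)
        x, y = x[order], y[order]
        def entropy(labels):
            p = np.bincount(labels, minlength=2) / len(labels)
            p = p[p > 0]
            return -(p * np.log2(p)).sum()
        base, best, best_gain = entropy(y), None, -1.0
        for i in range(1, len(x)):
            if x[i] == x[i - 1]:
                continue
            gain = base - (i * entropy(y[:i]) + (len(y) - i) * entropy(y[i:])) / len(y)
            if gain > best_gain:
                best_gain, best = gain, (x[i - 1] + x[i]) / 2
        return best

    def bootstrap_cut(x, y, n_boot=200, seed=0):
        """Estimate the cut point on bootstrap resamples and aggregate."""
        rng = np.random.default_rng(seed)
        cuts = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(x), size=len(x))
            c = best_cut(x[idx], y[idx])
            if c is not None:
                cuts.append(c)
        return np.mean(cuts)   # the full set of cuts could instead define soft boundaries

    x = np.concatenate([np.random.normal(0, 1, 100), np.random.normal(3, 1, 100)])
    y = np.concatenate([np.zeros(100, int), np.ones(100, int)])
    print(round(bootstrap_cut(x, y), 2))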

Keywords: Bootstrap, discretization, resampling, soft decision trees.

940 Classification of Acoustic Emission Based Partial Discharge in Oil Pressboard Insulation System Using Wavelet Analysis

Authors: Prasanta Kundu, N.K. Kishore, A.K. Sinha

Abstract:

The insulation used in transformers is mostly oil-pressboard insulation, and insulation failure is one of the major causes of catastrophic transformer failure. It is established that partial discharges (PD) cause insulation degradation and premature insulation failure, so online monitoring of PDs can reduce the risk of catastrophic failure of transformers. There are different techniques for partial discharge measurement: electrical, optical, acoustic, opto-acoustic and ultra-high frequency (UHF). Being non-invasive and not prone to interference, the acoustic emission technique is advantageous for online PD measurement. Acoustic detection of PD is based on the retrieval and analysis of the mechanical or pressure signals produced by partial discharges. Partial discharges are classified according to the origin of the discharges, and their effects on insulation deterioration differ between types. This paper reports experimental results and analysis for the classification of partial discharges using the acoustic emission signals of laboratory-simulated partial discharges in an oil-pressboard insulation system with three different electrode systems. The acoustic emission signals produced by PD are detected by sensors mounted on the experimental tank surface, stored on an oscilloscope and fed to a computer for further analysis. The measured AE signals are analyzed using discrete wavelet transform analysis and wavelet packet analysis, and the energy distribution in different frequency bands of the decomposed signals is calculated. These analyses show distinct features useful for PD classification, and wavelet packet analysis can in most cases sort out misclassifications arising from the DWT.

Keywords: Acoustic emission, discrete wavelet transform, partial discharge, wavelet packet analysis.

939 Bifurcation Analysis of a Plankton Model with Discrete Delay

Authors: Anuj Kumar Sharma, Amit Sharma, Kulbhushan Agnihotri

Abstract:

In this paper, a delayed plankton-nutrient interaction model consisting of phytoplankton, zooplankton and dissolved nutrient is considered. It is assumed that some species of phytoplankton release a toxin (these are known as toxin-producing phytoplankton, TPP) which is harmful to zooplankton growth, and that this toxin-releasing process follows a discrete time variation. Using the delay as the bifurcation parameter, the stability of the interior equilibrium point is investigated, and it is shown that the time delay can destabilize the otherwise stable non-zero equilibrium state by inducing a Hopf bifurcation when it crosses a certain threshold value. Explicit results are derived for the stability and direction of the bifurcating periodic solution by using normal form theory and center manifold arguments. Finally, the outcomes of the system are validated through numerical simulations.

Keywords: Plankton, Time delay, Hopf-bifurcation, Normal form theory, Center manifold theorem.

938 Combine Duration and "Select the Priority Trip" to Improve the Number of Boats

Authors: Liu Shu, Dong Shangjia

Abstract:

Our goal is to effectively increase the number of boats on the river during a six-month period. The main factors determining the number of boats are the trip duration and the "select the priority trip" rule. In the microcosmic simulation model, the best result is obtained for durations of 4 to 24 nights with DSCF, giving 812 boats, an increase of 9.0% over the second-best result; with FCFS and durations of 6 to 18 nights, the number of boats is 31.6% lower than the best result. In the discrete duration model, for 6 to 18 nights the number of boats increases to 848, 29.7% more than the best result of model I for the same time range. Moreover, for 4 to 24 nights the number of boats increases to 1194, 47.0% more than the best result of model I for the same time range.

Keywords: Discrete duration model, “select the priority trip”, microcosmic simulation model.

937 A Ground Structure Method to Minimize the Total Installed Cost of Steel Frame Structures

Authors: Filippo Ranalli, Forest Flager, Martin Fischer

Abstract:

This paper presents a ground structure method to optimize the topology and discrete member sizing of steel frame structures in order to minimize the total installed cost, including material, fabrication and erection components. The proposed method improves upon existing cost-based ground structure methods by incorporating constructability considerations as well as satisfying both strength and serviceability constraints. The architecture of the method is a bi-level Multidisciplinary Feasible (MDF) architecture in which the discrete member sizing optimization is nested within the topology optimization process. For each structural topology generated, the sizing optimization process seeks a set of discrete member sizes that results in the lowest total installed cost while satisfying strength (member utilization) and serviceability (node deflection and story drift) criteria. To assess cost accurately, the connection details for the structure are generated automatically using site-specific cost information obtained directly from fabricators and erectors. Member continuity rules are also applied at each node in the structure to improve constructability. The proposed optimization method is benchmarked against conventional weight-based ground structure optimization methods, resulting in average cost savings of up to 30% with comparable computational efficiency.

Keywords: Cost-based structural optimization, cost-based topology and sizing optimization, steel frame ground structure optimization, multidisciplinary optimization of steel structures.

936 Reliability Based Investigation on the Choice of Characteristic Soil Properties

Authors: Jann-Eike Saathoff, Kirill Alexander Schmoor, Martin Achmus, Mauricio Terceros

Abstract:

By using partial factors of safety, uncertainties due to the inherent variability of the soil properties and loads are taken into account in the geotechnical design process. According to the reliability index concept in Eurocode-0, in conjunction with Eurocode-7, a minimum safety level of β = 3.8 shall be established for reliability class RC2. The reliability of the system depends heavily on the choice of the prespecified safety factor and on the choice of the characteristic soil properties. The safety factors stated in the standards are mainly based on experience; however, no generally accepted method for calculating a characteristic value exists within current design practice. In this study, a laterally loaded monopile is investigated, and the influence of the chosen quantile values on the deterministic system, calculated with p-y springs, is presented. Monopiles are the most common foundation concept for offshore wind energy converters. Based on the calculations for non-cohesive soils, a recommendation is given for an appropriate quantile value for a deterministic design that achieves the safety level required by the standards.

Keywords: Asymptotic sampling, characteristic value, monopile foundation, probabilistic design, quantile values.

935 From e-Government to e-Democracy: Challenges and Opportunities for Development in Montenegro

Authors: Tamara Djurickovic MSc

Abstract:

The Internet today has a huge impact on all aspects of life, including the broader context of democracy, politics and politicians. If democracy is freedom of choice, there are a number of preconditions that must be met in practice for that freedom to be achieved and realized, regardless of the manner of voting. The key contribution of ICT to freedom of choice is that technology enables a closer connection between citizens and their elected representatives than was possible without the Internet. In this sense, we can say that the Internet and ICT are changing significantly, and potentially improving, the environment in which democratic processes take place. This paper aims to describe trends in the use of ICT in democratic processes and analyzes the challenges for the implementation of e-Democracy in Montenegro.


934 Discrete Polynomial Moments and Savitzky-Golay Smoothing

Authors: Paul O'Leary, Matthew Harker

Abstract:

This paper presents a unified theory for local (Savitzky-Golay) and global polynomial smoothing. The algebraic framework can represent any polynomial approximation and is seamless from low-degree local to high-degree global approximations. The representation of the smoothing operator as a projection onto orthonormal basis functions enables the computation of: the covariance matrix for noise propagation through the filter; the noise gain; and the frequency response of the polynomial filters. A virtually perfect Gram polynomial basis is synthesized, whereby polynomials of degree d = 1000 can be synthesized without significant errors. The perfect basis ensures that the filters are strictly polynomial-preserving. Given n points and a support length l_s = 2m + 1, the smoothing operator is strictly linear phase for the points x_i, i = m+1, ..., n-m. The method is demonstrated on geometric surface data lying on an invariant 2D lattice.
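
A minimal sketch of polynomial smoothing as a projection onto an orthonormal polynomial basis, assuming NumPy; the QR-factorized Vandermonde basis used here is only a numerically convenient stand-in for the paper's Gram polynomial synthesis and is reliable only for modest degrees, not d = 1000:

    import numpy as np

    def orthonormal_poly_basis(n, degree):
        """Orthonormal polynomial basis on n equispaced points via QR of a
        Vandermonde matrix (a stand-in for the Gram polynomial synthesis)."""
        V = np.vander(np.linspace(-1.0, 1.0, n), degree + 1, increasing=True)
        Q, _ = np.linalg.qr(V)
        return Q                               # columns: orthonormal basis functions

    def polynomial_smoother(n, degree):
        B = orthonormal_poly_basis(n, degree)
        return B @ B.T                         # projection (smoothing) operator

    n, degree = 101, 7
    P = polynomial_smoother(n, degree)
    y = np.sin(np.linspace(0.0, 3.0, n)) + 0.1 * np.random.randn(n)
    y_smooth = P @ y
    noise_gain = np.diag(P @ P.T)              # per-point noise gain for unit-variance noise
    print(noise_gain.max())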

Keywords: Gram polynomials, Savitzky-Golay Smoothing, Discrete Polynomial Moments

933 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary, and digital watermarking techniques play an important role as a valid solution to problems of ownership. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying; the watermarks need to be robust to attacks and to maintain data quality. We therefore discuss in this paper two approaches to image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with each approach separately in the embedding process to transform the cover image. Both PSO and GA use the correlation coefficient to detect the high-energy coefficients of the original image suitable for carrying watermark bits, and then hide the watermark in the original image. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach obtained better results, with PSNR equal to 53 and MSE equal to 0.0039, whereas the GA approach obtained PSNR equal to 50.5 and MSE equal to 0.0048 when using a population size of 100, 150 iterations and 3×3 blocks. According to the results, we note that a small block size can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarking image. The better PSO results were obtained with a swarm size of 100.

Keywords: Image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform.

932 Real-Time Image Encryption Using a 3D Discrete Dual Chaotic Cipher

Authors: M. F. Haroun, T. A. Gulliver

Abstract:

In this paper, an encryption algorithm is proposed for real-time image encryption. The scheme employs a dual chaotic generator based on a three dimensional (3D) discrete Lorenz attractor. Encryption is achieved using non-autonomous modulation where the data is injected into the dynamics of the master chaotic generator. The second generator is used to permute the dynamics of the master generator using the same approach. Since the data stream can be regarded as a random source, the resulting permutations of the generator dynamics greatly increase the security of the transmitted signal. In addition, a technique is proposed to mitigate the error propagation due to the finite precision arithmetic of digital hardware. In particular, truncation and rounding errors are eliminated by employing an integer representation of the data which can easily be implemented. The simple hardware architecture of the algorithm makes it suitable for secure real-time applications.

Keywords: Chaotic systems, image encryption, 3D Lorenz attractor, non-autonomous modulation, FPGA.

931 Steering Velocity Bounded Mobile Robots in Environments with Partially Known Obstacles

Authors: Reza Hossseynie, Amir Jafari

Abstract:

This paper presents a method for steering velocity-bounded mobile robots in environments with partially known stationary obstacles. The exact locations of the obstacles are unknown; only a probability distribution associated with the location of each obstacle is known. The kinematic model of a two-wheeled differential drive robot is used as the model of the mobile robot. The presented control strategy uses the Artificial Potential Field (APF) method to devise a desired direction of movement for the robot at each instant of time, while Constrained Directions Control (CDC) uses the generated direction to produce the control signals required for steering the robot. The location of each obstacle is taken to be the mean value of its 2D probability distribution, and similarly the magnitude of the electric charge in the APF is set to the trace of the covariance matrix of the location probability distribution. The method not only captures the challenges of planning the path (i.e. the probabilistic nature of the locations of the unknown obstacles), but also addresses output saturation, which is an important issue from the control perspective. Moreover, the velocity of the robot can be controlled during steering; for example, it can be reduced in the close vicinity of obstacles and of the target to ensure safety. Finally, the control strategy is simulated for different scenarios to show how the method can be put into practice.
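
A minimal sketch of the APF direction computation with uncertainty-weighted obstacle charges, assuming NumPy; the gain values, force law and example obstacles are illustrative assumptions, and the CDC layer that turns the heading into wheel commands is omitted:

    import numpy as np

    def apf_heading(robot, goal, obstacles, k_att=1.0, k_rep=2.0):
        """Desired heading from an artificial potential field in which each obstacle
        sits at the mean of its 2-D location distribution and its 'charge' is the
        trace of the corresponding covariance matrix."""
        robot, goal = np.asarray(robot, float), np.asarray(goal, float)
        force = k_att * (goal - robot)                       # attractive term
        for mean, cov in obstacles:
            diff = robot - np.asarray(mean, float)
            dist = np.linalg.norm(diff) + 1e-9
            charge = np.trace(np.asarray(cov, float))        # uncertainty-scaled charge
            force += k_rep * charge * diff / dist ** 3       # repulsive term
        return np.arctan2(force[1], force[0])                # heading passed on to the CDC layer

    obstacles = [((2.0, 1.0), [[0.2, 0.0], [0.0, 0.2]]),
                 ((3.5, 2.5), [[0.5, 0.1], [0.1, 0.5]])]
    print(apf_heading((0.0, 0.0), (5.0, 3.0), obstacles))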

Keywords: Steering, obstacle avoidance, mobile robots, constrained directions control, artificial potential field.

930 Complex Dynamics of Bertrand Duopoly Games with Bounded Rationality

Authors: Jixiang Zhang, Guocheng Wang

Abstract:

The dynamics of a Bertrand duopoly game are analyzed, in which the players use different production methods and choose their prices with bounded rationality. The equilibria of the corresponding discrete dynamical system are investigated, and the stability conditions of the Nash equilibrium under a local adjustment process are studied. The loss of stability of the Nash equilibrium, as some parameters of the model are varied, gives rise to complex dynamics such as cycles of higher order and chaos. On this basis, we show that an increase in the adjustment speed of a boundedly rational player can drive the Bertrand market into a chaotic state. Finally, the complex dynamics, bifurcations and chaos are displayed by numerical simulation.
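
A minimal sketch of a boundedly rational price-adjustment map of the kind analyzed here, assuming NumPy and a simple linear demand and cost specification (not the authors' heterogeneous production methods):

    import numpy as np

    def simulate_bertrand(alpha, a=10.0, b=1.0, d=0.5, c=(2.0, 2.0),
                          p0=(3.0, 3.1), n_steps=200):
        """Boundedly rational adjustment: p_i(t+1) = p_i(t) + alpha_i * p_i(t) * dProfit_i/dp_i,
        for linear demand q_i = a - b*p_i + d*p_j and constant marginal costs c_i."""
        p = np.array(p0, float)
        for _ in range(n_steps):
            m0 = a - 2 * b * p[0] + d * p[1] + b * c[0]      # marginal profit of firm 0
            m1 = a - 2 * b * p[1] + d * p[0] + b * c[1]      # marginal profit of firm 1
            p = p + np.asarray(alpha) * p * np.array([m0, m1])
        return p

    # Small adjustment speeds converge to the Nash prices; per the paper, increasing
    # the speeds destabilizes the equilibrium through period doubling and chaos.
    print(simulate_bertrand(alpha=(0.05, 0.05)))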

Keywords: Bertrand duopoly model, Discrete dynamical system, Heterogeneous expectations, Nash equilibrium.

929 Comparison between Haar and Daubechies Wavelet Transformations on FPGA Technology

Authors: Mohamed I. Mahmoud, Moawad I. M. Dessouky, Salah Deyab, Fatma H. Elfouly

Abstract:

Recently, Field Programmable Gate Array (FPGA) technology has offered the potential of designing high-performance systems at low cost. The discrete wavelet transform has gained a reputation as a very effective signal analysis tool for many practical applications; however, due to its computation-intensive nature, current implementations of the transform fall short of meeting the real-time processing requirements of most applications. The objectives of this paper are to implement the Haar and Daubechies wavelets using FPGA technology and to compare the two. The Bit Error Rate (BER) between the input audio signal and the reconstructed output signal is calculated for each wavelet, and it is seen that the BER using the Daubechies wavelet technique is lower than that of the Haar wavelet. The design procedure has been explained and carried out using state-of-the-art Electronic Design Automation (EDA) tools for system design on FPGA. Simulation, synthesis and implementation on the FPGA target technology have been carried out.

Keywords: Daubechies wavelet, discrete wavelet transform, Haar wavelet, Xilinx FPGA.

928 Attitude Stabilization of Satellites Using Random Dither Quantization

Authors:

Abstract:

Recently, the effectiveness of the random dither quantization method for linear feedback control systems has been shown in several papers. However, the random dither quantization method has not yet been applied to nonlinear feedback control systems. The objective of this paper is to verify the effectiveness of the random dither quantization method for nonlinear feedback control systems. For this purpose, we consider the attitude stabilization problem of satellites using discrete-level actuators, and provide a control method based on the random dither quantization method for stabilizing the attitude of satellites with such actuators.

Keywords: Quantized control, nonlinear systems, random dither quantization.

927 Developing Efficient Testing and Unloading Procedures for a Local Sewage Holding Pit

Authors: Esra E. Aleisa

Abstract:

A local municipality has decided to build a sewage pit to receive residential sewage waste arriving by tank trucks. The daily accumulated waste is to be pumped to a nearby wastewater treatment facility to be reused for agricultural and construction projects. A discrete-event simulation model was constructed using the Arena software to assist in defining the capacity of the system in cubic meters, the number of tank trucks using the system, the number of unloading docks required, the number of standby areas needed, and the manpower required for data collection at the entrance checkpoint and for tank-truck load toxicity testing. The results of the model are statistically validated. Simulation turned out to be an excellent tool in the facility planning effort for the pit project, as it ensured smooth flow lines for tank truck load discharge and the best utilization of the facilities on site.

Keywords: Discrete-event simulation, Facilities Planning, Layout, Pit, Sewage management.
